[Binary content removed: POSIX tar archive containing `var/home/core/zuul-output/logs/kubelet.log.gz` (gzip-compressed kubelet log). The compressed payload is not recoverable as text.]
i]f:F^TA(Ɯ\C }{Lj $5kg9Y(LJk9jesJD,Ds,R1KkdV0*UZNL ڇlPm$fk}3jʖZslȇ*je )'m(՘0-*D1a bZjЪ$r+9%C vn~5 K:1d] 8t:`amQSd Lu(ιaa<6CAcE0*GC+Bc7*J DA a 9yy@ȯ YSѦ8 B`V%9'cΖŸBJ(6$o7'EݫJrTFSTRmՈj#59K "ܤu3v M8$dSv%GZG7H 6O ҊJS23ݜqk'5͚ BѪR:F5EĆ-5 Vܔ3UEL4 XL/yJX8 :j-A%ckU3\/$@^N0[L. HJڬG6嬢ۈE (DjCI*MA*FE BԒ)eJXAcS=!nAJs REJU@2k8"}M3 O 0W嶻+]zskަ퍍,7ȊbDUtbM[d1p'G?"o V^1 `\8ڔ1p xT'#\l ᦳ^us h*ƛ!@ҪܙU|`iaEYVMFXTA-\Zc䋖@+Bj)3"`0ֳrOQ&`ُ*PqLZpk7' xZbt~VM5BUri." K.ȎYYt&/dJӡ N P\- kBwY~r̶,"BfRtpzԫqVgqjurCp!.<]? 1xq ᯿^yWI硧mY 16_*>@z|S>W3N?)Sg_\ )_bq5mŏV\}%ntg(DžD ҴIvuUTTAպuݥ.g(C}Ϫf:5VuJd?VWKJr;4yFslR1h9߿øcM֚t0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:Nt0:|`rV:\VՁ c>ZlUF j- 5Į!-D۶Eŋf#%bIúhuHeI r=>ujma>ݚvp#ZV+s xY_dR$-;q;E/Y@aMaf~ ȑ~]HOu~ҏBQH? G!(~ҏBQH? G!(~ҏBQH? G!(~ҏBQH? G!(~ҏBQH? G!(~ҏBQH? BzBj1 9`{o]쫫sQ]%VWـd>{ߏ{omql,=4S ELHC2cG,'koށ=}+oSʷW\;4xۺKye}_nۺ|H4>5guoFVDLN-Aْ[̆KZ/crU\-!:89g7_^  tI?( }Wgu~#oݻNL˕;WH|zz|N'M)BqB[0,Kdompݳt}zg!ڠ`d){isrVzUL?5{ўb&4~ Qb$M$!OƸag6dZORkBȔ&#S&順#HS=~98755_{ِ]/ ɯfl+"dgZ49D}NEze*þsͯ]r9^jwijQo}NkhI/Ճg}USy0yp| x@xqzOp$Iz!cr YNҏiPT]ܞŭoG 5u+[v~Z*u',?덅feZS-Aߞ~H-rvO[} <Ў߯+F@G{Pj )<ݹ2olwEr?6W~X_x/-@\z=B`j˟_;=~=V'0$?p1f1<XeZ(hkg͙/r"O5 rScE!73YwzuuT_#~զ;qԺknann\)/OOߟ<珿'{ĽyG:N֨S"8ugq7x{{<:h<-՜>UmKsy K_r[DtW-we<^m}כ}"knJ߽N2m~b7{C;kTE+uB"0ćuuN1HWfmbڎU}v7Uf(E*Fr%fr(?{ƭ /I2W}6JN8jpxL4/ocwq8$E ~%h z#H[O۩saN(O?D#\./=a40vTqЊ2CGG9+ׇ!l] u u8x&w#pwta=7r5p)7+#a,=#Q3e\=iM;om`5X a; ʸHbV޽5aж˿rœ3IX`onNi^ʿHʿ&]e^rb^Nˉy91/'ļrb^Nˉy91/'ļrb^Nˉy91/'ļrb^Nˉy91/'ļrb^Nˉy91/'ļrb^Nˉy91/?]e.2ל2K0/L?xeRļ]r:tq}-ΧUm&ƣa/ȬI(reV)0@Ij™܀x#oCzۆ|sy&`}$- I|n9FCvG 6ζޖm)S.x :\9vsa|L;cx/~ |w|,:s޺mC_e_G}r4Ƽ7a y- U>vڂzB ނQ >z巊yj){_|,FPo?Aw*]k|3tyNV<(3q/2Ϙ13bH$[,"Yvz`JKRX]bcYkޭ+=}`]Ry5:Npì/&ۃңTH,?~;x7!pKC"pH!8$DC"pH!8$DC"pH!8$DC"pH!8$DC"pH!8$DC"pH!8<G뉾3eKm?5]H^8\?9P;9չNCEnd7!cMt&@Z!<H)ex|Қwfig@!^g 1 `t ~3UB%ctۧXfM S0  ࣆʸOYplsEظdnqaG688+kx&]}N+·n-][ͩ#ѵ3%ܚtos),qvեap/fƹZ֗fI+o;'e QDx;J܏*(v@FEmoE<`W^s/Vo{7+>&Ѿ q4A"uk0Gԭ 4 4rk4֝k!_A@lliOq\?^{hk ([E'[D-6wsP{!ؘIO3,& d` %iIřԯp{Kdr-Ez 
~/-;hl~fGݤVjRGki/%v'!B`0nF6xai/N~'{{Sq9`GnNs#"vPݿ!Ig*QRq/{^IY?zVW,LUu( %A`L%\JP2J+|P2JRP%q熧t;ͥ7R\ʛKU?5x4%R\ʛKys)o.ͥ7FV"ͥ7R\ʛKys)o.%P 䊠䧳(8nfթ]e<㌰ՙ4'ƒ &.qԿҷI45H0 J0f`fĹ9%6x$(9"9(M'Rݹ kx7{303D#~ kBYqc=WZ$yǝ2??`I:߆I C|)Nb[2ݿ߳!2T&|K<:(b:+d1y!f3S`)"gCt^ J1gyQ+rV^&Ϧs#j.-ʯ73q萮\Zuweǫƫ̼hC?/a^I/A0kWå{2A5℞{El,:1"~pgPgܲcP7Vu/A7_u_?ŋo.0Q/|y|=06E}7ܪ_EղiTU f'ޤ^fk=}kzvlm-':^~w_^b7&x~Ći~UvQJU^҄(uqY Y16Hhxg(#!tStq F܀rv_m}!Uν:,!(ROee}$-90' 'Ps538 B0 gqX;UXhEONݡ#SSCx܄ ap08`Σ$Fib̔s3FiM;om`5X a; ʸHbV޽2=59Gma%䈧f;൧V\q[bňՠ_Es⑫jGgX)#џeqq$@rbRD5U:3~]}|xKksߥ;tJ`o0s6d:BrQDSr}Uʺ,_VM۵F[5u<-JE4C9v(.hƤ8a4{%˪[(6b/[hӘtWpǀς-[UhHXW-FƴGhh!u+x-Z~"&Mp$sc̲l<B I1Y @"iɡc"9'q` 35[5GK>k >NZ4y)KqD ]d~.=ߣ?ߣߟDK*wrfi3 \j2sJL`D$5w;դ[FDAXa &a3BS-rLP\j) 99GcKεcycpۥǒ*9Қ)vRR%< ;KI-i5qf8<$nOԿY,qhv}_D3<ّY[헯}ISA>|m9lwMYAhH b iCB:GT}E1N?{ȍ俊0.~ 08N`\IaO[Yr$y9wbdےcd*LVwfl72TWّv-+窘b356c㎡:oR1 -atiIZ(nq]\C7ł9 ˭."1bePy[\u]~9l{ ZGK!zƌƁZ0|0Q^1Tk%iymF{`sH.9'mM;>W?/tlf_?V+|zGzDhZzO Yla1^U|Ku?]ѣa$"_qٖ3{d9f&`R/ ,Zxd!RԔ1тFQ4`pH~4fn\$":5wY?yyluYk3ĭnNcF1ZleF͝QFGR |tȴ:}v~2`)~d6f pѶ^FQ2 6/\ٲv6PixSF1 )`)$$dg`D;>[DOC"MCPז#}T"ȸEFI`8jK9ŭ(tvٍ6)+6JC<r>MZr qMv8ct/8a \3::: ڏN;#|8r?%\ҥx~?{Y`B<!x0k5f,`ZFL&Z0(%5,홻HqCYߍlK[WB%\IXqvۨlgK펦72~nzt7-۴+1*${[֦Sq1%lԕn|Ge{g#AnfmAFަ{5>O 'C;?oއ j:XaG3| Q)!z3ۮUR,fjlpw?ѿ&!ft KnӢ Jтd k,)PV8dV6%m{m%`JR1G EmrȰ, Ƹ#aREbHrt|ADa wb! "=3%~FCxIܴMl4KM O鶕čnKc >GrpyU,%a,n4i4 QNuzU+UțW?•o}R_0%80~.}NׇnG;_X:N(G&^z68P륜+p Qj@B =BZ ,,oI: =VQϊ |5G^poYCR7̴^~.o)^ٝJGD*?3'} ]`+@%ߕ +,x{#cХF?z|sle~V?=gQAiY&v͕Jt!Oux`]| -,Q;7 qf%p3roDʵǍYo+]"gUOU95YJFHq|lGm7 ,3A [8dS#JVE\i3ekUtŸ(w,>OP fh/#Et$LEt!"'U:` z /H"88fSa1 A0 ('`! 
=rl4 {b2eɘV!!: $H[XA`Ir,a#ֱ6di%-KZ.%%XlY׷i=wsQbŴ`+ ;$%9:Y\cHZyk+7C=}7_s7}(@^ik8c{RѢǂ M0&sKM˾Xw ZFZqO |96lI!fH %F7g)Oc.*5X@UGa| fB(iWOa*wc؟8GlN/7 y<%7%pSpӰ7Ҳ O_zoK-|?,S_ͯѧ'#he/zjs7p+g_+L5o GB[Si\l raVC iMFc 1NJn[]KBKV)bxЄkB@WKaKo|QV aUU;ae&jXJ3:"WkrY{8w 3 :  -,% }R-A'| yBvCj-Ž5ߍ4[^y& .EżP@;l`WRY._",Yydc|LNDR%nŜ(asy8A'GSVΝg7x g>!c=\&&VQR1q<@cLWL#,8ZI}#=^Gy ;'`;>W?/tlf_?Oln>C#A=j=%Oqx9ىdO?$#U$"_qUe?_ 033z!U`J# be}00+Cʘtk~9aſ?κ"*;]uګ~uh5Z`(3*h2:ڀmT dC$=1g詐alrw ҫ:,Zǫӣێe(FHZ7̛?*&)B8 ". ieFf;f 'CF$͘]܉arht@VI1b"CRN5QrA(hiG5\Sr]'k7F ¥x0!S&pZUARk ˾ Gg-6=LB>\ۑFmrL). 6nΆEU>uKHsOaZa]4OhW0Ea`#kRq0*$9YAJT9$ Å ')}BcE>Hwz&RrHd 3âҌ;$$" (O u{< ;Kmѯ OǣഄU[Xa*DD+ӄ@"c-RvH!&PA Q #|ϵ4#H0Z J0fA} K F)qUTsD ( R˂ ,spC8 #o);].&%ܳ\-DKSY^.0Iz35Cz( θ3CN{ wEކ1_T{]`@rN!\ܡk)(]$EbvoaYi\5!!Bp黫 +.>m~&W_-T9L[ӽf?Kw՝oo7 VY`b1[?ME[]q {npAAa5$ecOt iF7w,A}JLTL+h82'dg?q "G6RLVFHc4v KK|{jK2Qmeɣ`[O[$+ Qn,U_U9W\׵96q7&[6J!۠LnL )&m A\Dw}مN‚H 4D),rJ"JERNq8ikWVqҢBzJ&DG.ȡ`*8Ò0t\p@I>񺹈C\[t't${lnŸ6.w%?_:p^r#ILa S띱Vc&ye4zl5BZ" s x(o)g(vI6:@Wn.ȕ}wm$0T* by(Ct^^]= gxI;vhBJA_SÕ' 5j)RoWoN{qcӉ Օ?PqMpn k?.RkzZfkNr:hkQ9k=z spO3n5.9}<;½S)ޏ" E5\m}۫w= lF;w_nFͨT1-䢤L V2RJbVEvtZGj vvUB9Yaz?MgzN%|Zk^Wl]&5HR%w:Wj Qqlsm#Xi5emtӇl ,`4?~q2z)Ĝ[@ÄH%sr]tD8bD,P8sRXKLjHKxIlh(g$-}yh/SCo4˳Jls9^Uh!I36d_<*#s)),o$MMi_1>|=)ٲ=܁ gdAC ZH$7nyi%17Y|1_J|JH3)(Jx+܈fMnh.Z3)w9d#]:M3Vv鞼wnu>ߗ;ʏJݮҖFt wDA.^7BZui]~Zui]~Zui]~ZuٮOabk!ɾc.z0WLks.`5i"IQN)hkcܛaL7vy&: LѭEE y }Ynw!,sQwi ii?aD\ rŹ:QKSwNTֹ{tB:7Wc.0Y/9R,Bdsk,ɓ!tk낪GG|rf^=9Îr 8M(rn5;:2,"1rbrX+B"Ikmh-2nخm rB\Z33? ~υgq3I`Raz`cNn.͸tVnLԊ8r5pn>6~l\1gq{I9!n6ސKıÊGrK"YRnlkA|SOΧn;A9):LϾ{? 
{ ]Vxyڷ5^1i[Sj`.hA@eQ[Yy@Uv|6`řmȃp1(pPfͳV, ogeqhz`;,<_3-#G "c(:`f/cѬoPI~:qk@nn:ڟU420G J^B:`^e.DjB,hWoH"88fSa1 A0 ('` )2Gd1MH؝-VZdL+ jbKFb$S, A9p0I̥{ %J6%%l B:y`L:mOt!1'A _*Jn},+\8s" |1,F;L}N-Ebӳ*Cϗςx LĪ_W'J:.%) ӕKǯ׻&V(^%2W;rl`!8-(T+F2 /T!`,Q%g2XAJT> sI^뎊tPO0RFǤ"9̰4N+!ʓ=I'Acd(`܋ UD}y͠>-au؟:  4,)X`05*(!pJ\ #`,Q+H Ҳv2sﻖ1|E\7d};?MإcM(+ӝQ뫭">_ì'0IbfJ07ƅODPٿ?Gw\oʽ2#SI;*?"yق1tY; 9F"r|/Swl"\bJ8tu^6R0 8+: `5&$H.*OnZ'0 w^n.ްa2:|*鐓91"G Fl[]_NOO7'r5℞ EʇF)hr\}] I|p>RU0!S]:mFܮz||4g,&\O^v..z:3Ej7a& 3в\tYfyC_ꨔs?(]L\w+:"UV:VT "292{#~q֯!T)+*MI9*י\;?W?{˫??Dۋêap"e$p1 ?oFۀ;M7hZ647n*F5zi6%-^7Y[KCfyϿ.9}~1K?ijUI++@6uu\Ɓf%U_:]ܵ/U u@`پgk@1VHhb#%8cJSVq FVw<;~y4H)ʝ-2%P?1~Fgs@l{#7\1aoX{SXhEYh]s^CVVY1fA}p뱲209OnH[a1S3F!T#PEn6dߥc7F㉹Ȏ#I}_GQke:Y/\Gǫ RǫN'{0`o+1E{HS63ˇnIy;kHE>`N)VPH*Fޟ| {wj3|@m@RX*%Ki AI9i~;mJ+$T΂#"᧠oԾf<2/nʛvE M3A}Lp5(*浤ؑ`븒ruƒE*OySQ`HJd 1%0lix1ry8?% ~E풤ӽ|fm弐{>LN] o6u: QhH b] iKB 8#Hr'X i۷ CwR<[rl [e<.s(vҹ߶͞){:08.@ş> }=M*{JanǦmF :[Ӭb)ܹ<"Zk̕ ɂ2٥}WOj>םy^XIy^o5k,V~гP1t50\W~ȳe0Q^U6<#lH{U%OW[^ ]' _D~yhnMВ6z8ީUt_9֖8 x򶷍7&±DXC6\+~[;.U='BxN).1!uŇ;xܴ;3m#sg,=/dEǸ/hR<`s-:~FR:`Ftac&&VQ`o($<3@cĜE NVF߆m6㾁#z}g;kT}M%dV۬++|ٹ'N8yjHxOxg@f &bkeWrCQyvTVIEΎ“Wx?0x&\7#Xq." f*}fj]jhd#*1O䢤L V`t0K9+ש9r&`VYAA ,_}ZGVkG5ȓay6n`Hv~foG5*?߼+UDfF G^Q)Sh {uVC*;2(x$qܘ}RSX'[ La'w,<W/gWuۄIjk`&r'mWc]aLMA:A4:mm˔GGw;A}_,SEZVĶHs |L6c[ǵ`4zZJ*I>Zr`OOx&>aR/1",DA/5eDDL ` Xy$RDhgꣵ3mƶMb atǯ\hV6:ƻu>7MRzsf-޿o6 wh5Z`(3*h2:ڀmT G?_cNч8 baukJ0Umv0VTS}U9c>|7!ƤJ.iZ?:ͪLoEyjJ;tIJH&jTcoh%Ϭ*- 봈&Nz/ p,xvI('E!Iѿ]+z7- ZCÂcδ[1uQ_?J且lkg jk~"\٫PTe!*K@>ƿe ;` lNYKU 6l:%i0@ߵ 3nlDmiM]C66 Nd7;uJvmӚ6\6@ 薶KjWĜxK55CCű&`e M ;^h3̢#4lP3vFmy[6B1IwR nKfr9IXN7 *0%{k:Ioh-ױ ` 5F?MЭwlgз+m' a0wgw෦פkCV4-:;wY[O}~=/8k=X3)ommRهf)ay=IJ[vJY MpʻAiӂ 5{대]9P&p\7~թmFT} Eow^ +`4tz\]. 
zЧ1eN{{;pV|usU:|kF&1\.nւ|}iߘ aUwZpR9lKi0a2(,͒$ƒ;BlrvY9z.-]B*mCSý*,_zy X>]ecufDV ~9Sr+f6jv7+owƓa1*\ٛ*uaR,+ջ.3~0Ηz7|Ke83W'oBӾ;6Bk$ X0^26_HTFZ-Q/]nURɍДTYO{pw~?~9Eum=o'}Wn|_Tx]y  ?͵$ Ýq_*Z.@w1cvXH&1B#*SI+ε JEƍ)U9aK%RS*gt) It` y K)/Fx4 <ޑ8ʰѠR "qrf- #p[eG AGQg}PH D(K Z";9 pJSQ[c,JÃWMepEDVA c6 <Ncé׸xǩK|b790TW @a&r]u&7YZjlbR20N4(YT@Q` 0=(#%A@ %85EYfGj"J"`9@*%Q2 />zt^9Ra9 ƔXA@Ո분wtXzE AgAR "wpK&H$ d̂C5'c8aӈe6r0@)y@e/6iQ .ΜR@ FZJģwNb GC9;h*`C^%9:;onVu1{n4+󞻛>0?{M9:[3cw3_gU0zE=IϋuR+FR AT]dUt/t9naogEm31}c];KκX?~}څcOh n>^Ho=O~O xS;ĒkaP dOc.VFݻ tax+̬sy-|Nկ痘WF@NŤ߯_nBK/ YJM(1<ɟ(I>e @. |IQjo~->O'A?znDcc ?@vUݿa[UTúڷ55kmYxz P.etLƽ~ү(*}>7ٙ"1 PdE gI~5Ź׻' 9KnŐ(EAED6ƒ< Y;!}-pONWkMNAӣ@8M(rneZGEdQ0brX+"D4N;P[EV8UF ?!gfY§19N0=Y(>f7Z 2Y &]IJZm47hl^EߣmLEުTRW:HJo*`WןGSkğJ\zDQږ|~}Š;TԮY_@ҽkG>Zf|)L.iizOTWtNScuS*ImU-%mcg mn8H6x?ަ_Lee@Iq6[ҚX{#Rrt0^,SHSJ7rrFDq^ZZ;y͈HKۼL^ˬ4l+xwЉ.Je i1BtȿsŤ'_’ң>n3!fjQVfӞ^v=BwHiHn":eg$C0!IoVX-zg՘IDk1hFs+% v͉LtN/'jK ^Ȍ ^`A Ɯ4皐_kGꚯ(xEed{ÎGn~@n~o4x@uhdJ(D%$@tY$0R:" bv|_E.qVq@)a( rY D{@ Ҙ&(ƾ  #64(3uS/N Ij$fXǚ5nَ:J(MJ;i k(cU&I"CRN5QrAQY-78hD0/+h"=u:%KM)8FҎY Cʃp-A[/5^#>ll9;F^ҫr_Vs)%ŠQ:ju0!,{ἮP K!a+?*{XJ ÙrIރ5|#rSfb$fJ03.2`Fٿg].;𤋮IW.Gc -">[1&krp9V` $"a:RLݹ­ߟoC_{`#Ӱ5!?rT~IΗ =wU:h뫄EOJϡTUC.>Δd %٧x%'],_ ׈z1NO^+_m=YqNʇW U*-gitV=܌t\z2^x,,拹5{Eo*R풣Pa2.ܘb~(^1$ڙv4d4z4чbP򨔥;Yn?h1g7rJeͳ^rݬX ^Q:e3Is@ . =Y_s cG|, AS` W|o}~{}w|{+L՛zs uoKhaeSS|b;L-Po=.&_yͼgZ4h{n.[K}ptٻ߶#d? 
hw8"O[,)fIJdQmʒ '&p33}pi'-:=̚Jz ?5nbV">|&Dx 9>cpd#-q&8Uc,J_i2ekawYYT`#)ET};uy rE GCfVZpJQ#ecNae!;uV!9sZc10 [-#i Q"mOr1czY÷szj`1A;E@wu.7P${֖.EUmdʿB|ī xgK50[&ߊQjusc̱r:򰴗;kH)EN)V0H9m!85<)*z8J\߇'Vc*͐@sWt|Z{J]*$TɂĔ#"Asԡ[=<#M%hh=MQaBS-LP#\j)y-,v$j=:\~ӟd SgDT-)1%.DpCLDl,,xGI4 Gz<&m8QvW؛K,cׯ/"W?y]71hB3$%FJBIXYQ D`4QZp LV^EooAwS6^b-KVյ%ٿ9#FhoM*8gCRqXt_K{e,!]Znu O\ ʠ6bƝtǽ)đMLryå O=cF@-8y>N(RFXp4G~asĽ >˥kl1A1rDn[v;IM714u~YV{ǣ]M](+L̻X(LiwiA^̑eGW%FjRWL?_ݕ#afn,B0G"`!KM-a$EV 1J'4v;Qoc\nH9e)7z[+9M\RJ/mPqx쓝D}#j F= dc4 r@\[0Q"Q".pԖr- YcJV}X> qаHt1HN'u|%\ur0#:jKDD[pO_ %axR= gg ?xj T8@ē ܻ_0௾2q9~^&s΂/R.SЋZ2(HRzVwKq`VgZ0$H/4L9¹u6]cmsn0¹j jZ:/4?בy{kJ8{Czaz;U=k8 lvd^Fcq?EMi΃b?Owgn|7*^gIXO)U>D Y*)K(+Dow{iۖiRyjfeMgۊ~bZ^z"jܰ/.dhP!L/}1U1y+o::d3͛*7Lg67\ ˙姡]MO!?ofezszzOIܞdUAmw)]8]VnבT ZlMC"8[x=Y/_C/k5sff̊y8\񬳆vէ+VVWVޱNcyybһ^}L;i*&ԓԾan=Ͷ}ظmǷe5vWa1YxtRؐ bj}z6-hgo FEh IlTmrlljY$7v0wtPjyl)o{ǔ(R5DU-844ַx*-LI>IF_E_bMӧ.{sPTW=Nz<|=bN8<?+3{yI7h eOy};ӟxE|m<(K-eTѳ )FEϹo{ Lo'1_~-q;1鍊f+3Xh,Ƿ7Ig8^r}_]~`.F0# \5gr] x?&z.+R[2EeZ1;˜Qƞxi~sFKG^7Zar=YD>~,SY_Au9㸫/C04bC1'aTL]B4M(N \t}j4>Z $o-)cc߳u'Z on[(&%֯Ǔmv L?fFXU!Kl8=@,/aQBƒEiC ŀd:NN7&(Pʍ"68Ȱ, Ƹ#aREbD h$))F 滻0l:QiӴ\3(iFwJ+< :f +< um~SbhD>dx H&wGէxC.Iq82U2D5 %aeNh;l-Ej +A)ታ>(BI M~{Cg{XrBچbb2%fcuFcB^9%4 ʕ ?-3 w3{LS \Lj,ba$zxahu^t"`fp wY9 ,lrbA$'0ڞL:$jK ^Ȍ ^`A Ɯ: =md:Z;n+\M A56D+HQk %.P!`V0/2FJGt`ZD"rRaV\l,4,r!c&S:$( rڐh#c}9AӇ= "e:tL+Pg($^2"YobA% 9Ij$fXбlִiYi'a{PN"{bVsR-U]F3U+[6!v5Qw=0p,sq̻X6ߥ)rCașG9Kq@Ȁ|i(YT(0y)Jx0!"Hl9OSog[C˔yΗvzߥ>Xׯ 6iqXk¸kzfxuQQOR3nVh@1gN-Xhm 4wFm6*GT KIw!ӳ&v ^ހ'رpd+4nr(Wd@Z?ӿ-Z6~/#1LT;] cPĥG2,҈GLLeDy f #_/ b Gwc=ꈴ<VJrʅ@ `BKMW<r%:i] %Ԁ A2 Vi,`VZ´okY7qvthpܾ>]FM wzuQNF&өcRQϲtGs hCq[aP^ˀS``EPBuV -ܢsI cnx[L:ô5HwL4:&(fEwZI%hD0AP4@2}QT} ܋0 SV}heeS`V!"Z!&1kV#5Z  @%G-(> ~hiE-ŀ٨$ c@k4+M D0q)q1h#bDYzV.WK~u~kHs C1s#.麏X  }j!jL0-k $%)~5*5JP~b0UL{G1׉GgLWGH)D07m;+ /k&ymqM^Ѵw9c>]dɕ4Nj%JkieKbf8Cr8bxmx<!x~:)uY:.*,73p Y0$1feh|Sw9\e$S,f_TM+$dQN8Ftu!M+gS?/abGgsU&?d4m͓~-uo~*^|>x-,fA>~*qK " I'3!/Bu#IY;G:Y$>  &,|8_U&+&,R;*AG&Y7j\APHs8r1ȯ% >ٰ< %RTMQv3׻?{Oog?| uWgz 
wIp)#jm o?z{e[CxTC 欧o2.ms5f-^=7[ CˁtUa%C Vq/)]I\|櫯eɜ8P&T}!B)u~Y:l ,+måNT>1c(NicX=6ƾПAF=iR;ɳf_1>m39AN6{#7aoX{SXhEY茺T9/X>zH<'V1c ̂ce%e`$%g{7JҰĘ)gCq\^lI]#5jޑS\2&1odOEy6ƤSzx+Jd^5wL.ytχB`{5oN `&fA%g\ȓ˵!,(  ND1n+KoV8mo K0=B(kB.0Z\3ruT: FϯO7a9ZIEv RNig PL}U=*~w;_B^1F|j>qnz,L᤺b:?_~!N~2,ώ7e!kz{0ÿ3/mcA$)6b5~fa+fz9Y|43OO3?,,ٺI T7Jd:/;o.4!NQ1[2^r# $g\Kr]Pۊn Q(C>ON܃o&5[ig0>Hjwt-ҧѷ?m TQٹBB,[J (ˈH)h;U)i))&8[^y&5(*浤f#Q&q%B{Kp>ųG'Hk^JJd 1bc1g`R Ƶ]5pA&{Vlru^UˆW_1|Kj|~7~ c(4CRb.PAVqG4%< O 1o+;Nt6EBM.&k":ap_垯i,%mg<}7+09K<+UdH-(dKvI) }|qTHM^'?H|8 `@~7zk*r9 ہKz;hˀA[:RҹM&dQ,Lb{wa= + NV{ثXea즀A085`r6Eۢ)`,LB$X- |.PkEe"5ja R6>ɥgw1׌Qc^? ?IkB0 T`=fkqĭVIhrOT7^&cyWp~`6[ mNr[]Yy4׍tc)z62V&iA4b!,䫤 ᬥi\dL=Nrg Ц~{trtGSEDmOtv]zhuǚ}(cy0$CUM+]=Mw0ۢ筣6Ra+Q9W )(\4 ?)).4颍$ qo%rJd ,0b53^}g VWEV&D[d$Xe֊[f.ۄM+f=m3^G{;3ФDRAcND[Ra*ht3MF/ƞ?5^GB0\԰yaΒe)uΦ]MAdͽ3nEK4L0H/aR]ŌtQX{O}JM{D 'L9 PbAe&}MPjł>XP\_\.Bh5.xQWOQ\1D\Y,P,{66o_g>g)H ~( 85Q$i^>6 gzsk2ba2Ycʣ" *LP!y6ܻߦ*MĖk׋fs۬[ݮl]-r# {cY$R*; 0.EPT-&(iw)ɂ5?q= 6AѾ+I(BWOP\)9:$qf+"+СT.;qdĕ&C:L`3dS$(SWZ(2+&|}/o~|sR>^d"ZkU bA* lec|-$r|qi+&*H 6r q=\ >]Za* ţee3$Yge/,eq޿@6K%}x??zm&oɭHlȃy(L%%6hQx-!xd`֎k.gfRlB*d;\襷6(30|k+H;I 7 $#ƕUxDeʹ6AȸQX E\sFN-q!o/휔Ɨۓ Za㜳onS#cǭQ- Bm=Yu|OZQhFHٜQs=˝ n$plI_GCffi^mϽiRor=$DR=,'Ώ܁*isH$S,6 Hh lĄ15CZͽ-C0!I' eZIDk1FSriXhm;2* Ք -f__\N"jp\ZMkt83Җ7Oyo^nO5]Ow]/&Ev!]MN5r%h IVwzs1e)[}!9.Ze:5ڬN,:9̧ڼUΧJuZtɤ97 'k9|Mބf6%6\u4WMWw>kL\~}_n7Qce۲ ;lMʩtH'-.dJbDT׭F ";1%NHo|U҈7M#^Cv]1ݥʝ քdKc8:]LsQKnŊ(EAB6ƒE!tb=T;k7p6frJqTQ( Ȣ`;X+\$JGFiJ[jJ7 tH*.cトgC!cd&pt8I9@Gq, f4VR\+<ťЍZn>v#xO9ci*} VC8.W:MQ4T1BZѺbA@H <^O ?ÊGrK"Y0s:%.|fg)q$(Z'T^ BY^ %U@;}w>08g@B$cB_[q<ַot/u鞼|t**SN$֎Do\!i]:bo4.& j!3[H;+A6@ ɩ;*ftGwv.D Ȍ:wA?8<{gű`n^` 9'&jBG zWީkV IM:XSD|0ar{+%Re[5pֳh-g>Λ'ݼ:iU=omWJF(՝JH1N N9ťoW12#XP1g&Ϲ6vDzuՐ]1\&we[1㫱/' (=ъF&x$ yIat$LEt!"'U2Ɂn}DY16S@qu4H`&$XG in9:xV!AA QzHLdu4(@%LR#1sb0zjIƞ_Ӱ܍nfY> 0: f0bZǣ]=}aapdVHnmmdSMcUNvB-#cxR-@,ypg6=zkhTOwX4/ަ\۟.޼p8û730 .i"g` A>Cײk] 樫 ߦ_}(>ZszUiD7#Kp)>vҦ Vq?G>Wlᬌ\t%URE_З*Dxh ϓV6Gm628^2qC픍&eX<``'ic, *0wHsIM~jN/eT9g?1~Fgs@l{#N0jx=U),,tFݡ#~KQz?9=f>XY 
20Ё%VixŘ)gnM"Looɾ֮xb1to]A=#O݊:1A5:N@DW߉`Б+F.ǭ$nT.IT;T]%'z%GGmTٗ &c͠Ky$"kY"`kKtm9j!4=+d]?CT.*GOU"]")qJ7Db΂wDðqmco'NPSCѧN6eQ 2yV} _y7d fHJ ʐ4Q0*bƠ$"ւa:mEx+-ԝU)CR->HC Uek .Y:kmgɾI8doM*7ڲ$KUB\qXt_ KBY̙C:<R+ ogX/{^$*8ʝ7\ 3f4Ԃt"iZ+I#7r7Q!ΗDl ؎/߽LɮmmwPHPؠ^wZ?峕ބx3ʊ`09.: ,-56HaVWvDˮR2 @`{i1JUz %<x ,@K\BF!@U`J# be}070+CʘtˮL_rL*"l^bx<ԺZjo|nnWg} 7kM7Z=\Ì_@3JTJpm  z2ݩsUh4BkM̨3hQ9b0XJB`%5r8x*2(%.pԖr-Ĺ8#8ZUHhGrxBx%\ء##oh(_8dX貈qچ]Dk !K[EYP&J|;0Ծ}程$NiѸA>_^&,j3^ld͊N^`nO%hfo[O<(׋rVϠo7G`L6o \ 6#їRȯWfAurIގgU4I?v$Zf1 ݼ0s:<\5 f'˺:;`uBq`X;vjcIKcD q4;jۆK2]= cw;%-4:npX}b'b=I3-hg7LL\xgcq{Er/GlnFZ3C =FGu z i obJ)fl<݆lbgi*-,։6I3b6/<(/ Eݎ%I @.O. ^1%6QqD+Xp #͕*Ae ֢.gK)j7Tk9=)$Etꤏ&X )aujP{P Rw)n.MKqӥRt)n7].MKqYO,JЉ`^L%D./tSVƩRe kE-g\D\8s" bZY*?);HB@xFU:dQ8^|4{j03VcI:ED)\Ě(́ kiG5\Sr]'kG:p))0qZ`b><kY~kl鯛ɘs̔jgai{nT8ǽxHUCMב6 ej-,& Q9nFZ1½y) c"(AR$Dt_hA#2:_umHxMu;=)9IEdRMafXTqXDI$GqcPafhwI >n)XUV%,)X ChP0b` 0u~iA-ŀ٨$ c@k$+- Dp)q1h$VQ1XNZ[FW6g=kqIaM(Sg?3W #Xsig$yD.4.}U!IM+5:=1.*`Ͽ\;+^GEcbL&w}Yɗ"һAK)(LQ)(a/ t^ݫ)`5& x\:$UXOnZ;^$. i"cq>;>3(|.7ή]vcJERb2WΦÛ9qB~{mtYnh\!7$s kU=R%01[St6{<]λƛUu|!Ỷ/Q/fsK.8Foea+3O ^3z'4l4wYd >ƥLVhpWOtqg؟n\UEymmdSMcUNeC-#cxRӷ诘 3xq@FШבּhv;s+`_޽M??]yuqŇwo`f`\|DG&A}~e[]CuT-QWM!7P|ҠZ _ތ/¥AwuYWM t%u=򙯿dg{C"sU%URE_З*Dxh MW_ȢFZF"1f S6a(QכPn=#2_Q6jԿ%eaK=]]՛^wt=5D:<M9ӧ jQABj^#ѥR RiZz)gk&_-@9s#Y04\`D$4z5ɡkJՔy+4+12֜ג "`G6ڃMJ*ET\'S |3-f׫,͕}܆} _28 Qhd:%u2- i(h J"y0(b-VBo߂NY5rO!#K]+/%Kg 7S-b<eS+.峃̹▱m)np09sH[Ga=bʠ:C72:G6A0ɵ r B($<`rŔϊz4‚S:?~F<פnNv}}%@f/?Ov5mnzGzD EW/\2OiaTrmCB:%m:c.CzppfYx2!oNx<5p~<-%|tqё#ZJȇE:whA2W2})s(˃D?qZڗy 0 Ar jQ'HJeS%n[,gY}pC[4oOEÓ}cWO㦩פJȷbY,jfRa3,WT\x~}hrvXaXߏ}|fl"7EnJ!?KV6ڿSCڡoH{sk!kbցA>pMלp H]tC>_|O~6?#,>;XN9OEx5_~z\K*eTnhK2(PR@rn;vb|om/>;W/Vz>Cvkna *kuCS_ue[`VWl|gy57~KՁEE8ګ`v';`CXίkRSaIrzF,_; H ?&.G=3+ՔR[3Eu:K2;˜,^7U>#uuk3>WiVZvJOt8hƶl 1c'WaMm.aQ9VD)  KjY :V'm>9_ܔI)Q1G Em*|#"( -J)摈FR 66^4`nf2*Ӳ\xƤQܲ @Fj%Xk asuO3#}>x$Ш:;hcB@IͧHu"H?ÊGjKY0s H|~D|_bEj +A)ታ>u!P TA j Ν>oBҵxmGLS[K&m VBw[wɇ@@LmTLb|z̴hNț?\fqMzkr53akmA Ý:K>d;VqʟN 쇅ϛUTp_0j78!tHx8'2Qp@"F!3*͙7sۄ! 
- Q;7 qfAn$*S˽) a.rhh}.sNC6>9:'ڗ/IgupU+t%GkbZ!8>jK ^Ȍ ^`A ƜoŔzBqu:JtL;ln׷\}|NO'Ͳwhd@aD%TH'Kґ0хT9tbeߐE.qVq`1 @A0 ('ֆ*#&Ҙ!Vd10B8XG!1z ",iP%LR#1s:C\EҊISZFőA}fe@v`#[< `Φ:H"WsbDO:HRVaB;#ulU"sQWZNN]]%*.+ΐwz9NQګ̼RhM-0p)Ӱ*kԟ z]+nq]x@o=_c0?m_BL+.db }1qMjx{cqu%|@l|\}MD_|(a3+aO {7.ZvfA*2Qbq6V6-?lZ~5\u) Z̧SqPkl-q5?n[1{aRA-|5Vv=Km}s|\Yˆ^FcD2Y+냉KM-a$Ek$RDnC~{u9;mF;vkz.aY$xˊ9IE*L< SDc(Q"! %4g$#9&" 8/pv@4㳵#(u5of73WDMӑ|D2`<`n# VW;P|0 +ItQO2a'pfp"3Y:.t?pn2,;%-/1vdmǑTR w%VyIGӛ#椾h{v6kyxObmf?pd" lGg=`?LxSFӱJ`$85;n2 );y҆ND@'}=}+q2T,@yS()' o&I;.)Ha)WL MT@lJ3;Hs%zEkYnFSX90j>p]iw-8[ͼ1r7?,??/='37|sbk5tmlr֓lo.Rݹx5|O\.kJwY'}sgf:R vEurΚ |Y{*=9_QFZ9jm <9LVVmh:tq<S؉1r[( c[oC5'SYq$⇪mXV wMЫͿfﳁӏDvTw{vNGJiL-0d V\ȶ~@\>s"FHu4 4e~|l_(WcXhPm)S5AZP3kfy.X4| R [H#A`2(% <=rt^Y\ڠ\LCZ[AjPp')Aϵ8>λu`jj!Ӆj%} K۟b':*OFlfxE؏&sk#h?Y$# ɤC#Ɓ?ʔ9R!3*͙7sۄ! - Q;7 qfAn$*S˽) a.rhh}.-N!1:JbyzYeJ%Z3 |VN9ťoW12#XP1g&p9,[1EP-oֲN>Z x-W_n@mI,:+PQkI* = fh/#`{t$LEt!"'U}/zX7$a iU3`L1h L> 'H !JG=rɃ4&=p+2dL+(QHLd$&A:łKsg H吱s"iEƔ"cpPY^3lq` Y-&;KWn Dlؾ E7g:RLࣿfrtA.MEx\O/Vq] J  -eQ,ݥ6n嬵Y|o{|;P4|ԡ Up1 ?YTnauTvw}~skҫnO_[EEVБ^̚N~T?~j\/Lz*BV0vF*v~Xz߀^^T=^:V'YwҐ0 ̉ӵp< E\YĴTH#pHxSS0LF 6Kk6M361c8qyl M>+^DQd(Q)&J.(s6*KMW<r%:i]HX7 \JlB Oc*5ʀ><6'Kȳ"90k'>֯J 7TZou"bGAt~j 2Zh\yvf-m"q00 ՊkeRUu'' `Y%P$x433_Vź-Fn!QM>X$N,X!gTmbT7`o9$<9Y MNk*%OR}W ]#;V?h4=HG',??}90T #AG ?CۮC; mD^0.BS^3U櫤QZ|~Qhj~HNY]ȿUU(OlVttEPS+|t_DB0,1KoBu :7w&:̺c[-SMQ^qSkbԗz:%z6շ^& l,±%*nY5Ki6@OvKٽjibPF`8a\BcP"s6Հ^"UGu27QhRy,t joB9")d+Ql|[AH 38gaPDFCCo)񄢔+lȨXt"#4f/T6<yx zR)TWS1'4RHf=!F̉YX S)huiHy P0+D YdQ%20ș|Ypl.H)eB JBĀ| /0 1(($ƦB$T7Y< 52Pa +~4dM9Fr@hTC!%07(IJH%BQ/!gaP`R  k-S b"x w'Ecw=vIyYz4K!K ~&j6>ЬaD@MC.M= 1Xn/ ꯶>yeۻ߮åmpeU/.fv)k:p[[ؚ%m 3RؗT*ՇŰ's WyG' =d3N#}؁GT{"61DՈߣI"=M՛stYNb&؁Rʢhm`1:)d9+ޱ{cwY6ׂ`gM"Dvi٠l0%/vF3 Ti6'v0`cZ;g\˷Lӄݜ\Si\LYg$]{=|} Y4y{P:!F&_ywNh6,ZopO$( N>Zw]soO}Ծu<\("rE'=߁bN!F/I"rH(Y5#Q{וJRu<4$xfj{`ce&c&a@)B`k osNT0Gk!ePN`Bab;JJQ|"]Z֟2d"5Y!:GkK{s%wR0xN[רɃzh~4]4iâ=YU-Ž[TRTsz[6:ʷWzъN-q\b>q.M%rJ`)i H16R][ȹ2ތR6b ;Bӱ-4-|R[mvZko%V͗>vz6MJ3& ãXN6[d kU- 
F*maOL8_zBoJ"u]d-[v5a.4hcCNKdwaDvee$Q\2=:b.:а+^z{E:r֤dU!![T*R19$CFZ q0IP:fC״^zM{JMj {XnY5QzU[ [MǷgE07اKQ֊/* Mv/>6 FL1IEMM]d'>Zt9"zKy }Okdr ߶1b>q 8sZ'r"r.Db(VGjŒ.x!H%dcu01J[+TJd@ˌC'dR[^c^EƎ7:#-+91+sg+%G᥵>$ZPdqv\~W"P}wr#KTJ: ɂޡ[Z ,ѐ3vFYWoޢs Ixs4łۖlRE&e)ߴE!9^U)ǮFREқg(tYa ^7#ӳQ(ldV"kV]@r.dpdҠR!/ptPAZ+RNJԛoِ̞?\E\H:n%hejFDh6nKJQ %|W1 ӓrYicO/zd*M@5V$ɭk| 9Jmvt_Fmag)͇w~Ly#Fv?$d%o絖hoowv]koH+.~.fd3sO `YrgV,ɦ$[% $JDbTuuC05&nVpz4=8抋R>>Ruj35{ufU7yp0b&̕A[v.PCmͤ* g;÷ ;Iz'UmHmtm'PDFA^L3h8vtywUHn}!`)*CH1إKߔ`b/EOkX(+.*`LY_?O\_هw?~H~8xÇOgww0z`\&G[v-w[ˮn ޚ*侴Gn}n7| R JC_pt;­VҢݬ ^q?d?Aͯ7ݞYIew3 Q!³BXepdX#-Kpc(NW`wY_#24zc,7y)Jx0!XNCBA?G"`oJM-a$EV 1CaW\G_o)d3&ܧF]=lT6_Zޕ󅹙\m6:x`qXk¸ Jj DaxFI9 ۄѪa~pR}=x&oO~#M&BW/Ƿ[ ,1ڹ@/Kk7㰝>ce; 8vR>- HҳPY5.3srN.ɫƯUichl#{HC\{:j Yw[^Mp. aekw\ozև_¶o3U /jz7% 񰬍Aݯ?jS}Sj%V]޾;+}w;DMHTlAsߧz xls[>PpeOCၱXhT V0b`GI;>[RDOC"A"MC!-GDpF:XDxPtS-쮶YA:$/vQ:a?.O46>3c|Q\ޡ/Wܸcva*~{t1NXZddH;\,^(%V Td0 ?EA~ȋNKtW7(d-*5w0}'71Uл!EC :u,g3՛JpĮ'v;/#4m@cr u1aի`wj]ԭ;\,ڛkvޕw7\!jfo0Gø~/gs5nj>-mnscnhNt}}Uק7Y_5ϞYmgs[+I?fI`.K5YCi,4{5͈YKnkJ*q, eK0&)q6W/\Q 0RJ\SLȱ$>xtRjz#5;"s|<*+\%i=ts3\ sŐ`wI;Ekf, 01Mпx=~鍆ܵ᨜]9x.,aR|mF0%, 1NJz\=w 5:)oy;tɄ@W} ߚiBhNUb1~OߞP+R[0EEX5;˜bDx~}m 9{KE$"_hY"MO7K)"AB2u40+ٱ@Bzh{HJ3$|9Pb1y FhUc1WIZ\%)+ hhUW JҪ7WIJ3_ҘvLI`y<*+koo9e+՟蛾4 6sCl)楷6(W00|[h+H;I 'mioOP`qAOL2XtQ 1DYǰ7)-A4ߢEI]@O<]:_+T?Q}6NƼ٬_QlW>W'᭻t$&*f2LT3j&SdLE6DG5BLiTsLT3j&Sd2Bf2L&6dL5Sd2L3+f&SdY)2j&SdL5[LT3j&SdL5f2Ռ2j&SdL5f2LT3ꑥګݥcw]f=X=qqR΍ك؃_=O~Q!2s 3s~ϙ9s?g3s~ϙyoD*;i)[69n{G6/I/=ƒiTB6|/sCUksVhO1-#/0W;W6ކ`r KS|V_zN vyjo{sx9},&qAٳuݿVgkmMvL6#=!-|]U_O+ 3~ %7"JQXXRV8dk_%c6.JqTQܦ—:2,"1rbrX+/B"IFJ["|hR|9̦8°J0I/Io4tUbMmJ(n πJ+!*Yx> mR~ o!=F՝F?,OFj4N*5 ]V<TXTx(Z@y.X/4}O R \ƑlBj8JtJq5NGo:Mݳ8gB {H XX.xkpɤͫ]ի0}! $:;{ Kbuu5;cP0#Fo Oax9MKI7 C. G#hb.E_/ľd?-#FX5!/>IGD#*SS*2nHbΨ@4g8 #[A&g[,@)vGE`.0 z# T)p@-VJ\D]^E)wj5t6NVn?wPVphc RY.8j1 [ǕF 猤!jYS(Gk&#LOp).}Ԗx93Tז3q[(e/T[(:"[m4ܤeK;5$w~ݠ0ˠ_Ns ,OXkT;By"2( %%v*tC'ZKR*祖N0+D{C (a:0-T]t%zmGC?\ltRrHh6.mRM2ɍgĐnد"E?.s>Ϲ;O_Hޗt9v5LD}3mz^-@[&A{ݰU4=4m:ⰬMZrP\x՝%H?&g>'ޞ-gC|Vsbܷn? 
{ۅ@Zדq]mݡϙK7ز^:G:Z6 ,.fQX>"E,\,3;1zt>f;HE QgleJ!gTVLGa(>~N]L!X)nH&5Y?9 Cǿ˧P~~zç}ff)JUH$q?\wbh4lhapk _g\#/Xu.C V !>w~=wGl8:WʢI^q]g_!I͏&mƺ,y3_85!f, ˔EF5tߊ9U r +oi}s`Wla쇌~6N&PD22c  ʹ/߉e;u#b|CkkS/[1p*zmFÕ5S=vtdU71B֐9'>r5˄PR(͚Yo,b&8Hc]jdj:7&wE i4a~;u)jKBWne56`i[YE뗷lR*T{M4QK2$PXY` ]k<:5h3J?Ћpd-i3eLI4&gJi`&Յwt/߀)DA]Lɉ p 4}JV֔z@*k Bˀ\\yh"*sF%Rl0Z- ":'HzM]6kBXv|fy, JрhR6KL1 MW4oՑ'` ]kG.ib?fkk#cΗ}_Lrcwxl")+LkΌ@f(F$9ʹY:b{r \(z{;{ӧTZdŒa ը{ӹƾm'oIpMa7襙g3-WM3+#=b(muU+ V^z[eVxLdbEó{A=8 H?^z Ev᨝Y#R/;Kcҋ2QVFY2`zVSșo[;;'ΫmDj3cӤS,<:wN/h .v2NReS\ l& ̘=V *pLf69k&z M56im;vOD\}aiیPqBprºc.G|}=ڤ Z@2yhI^Y$YCm(}Æ0RujvcAxgU+kwI3' `rr\k 3"\95Z;p0es[_/Oj640-nQ [+ -BM*um „`m\q{ઐ`[PͦUҘ^ \!{MpEs`[W\! Ro:\*Zz1p%Rj c]jt"*͂G +d|AOޤ|r;#׏~m _6uwQ^MǴܽ}``e(/cN jR]3=>ߦ4f<LŨɬ տ~~f䉻#BFEm!Xnl[Ub[fBMm m_lQ&F)J!ؚXFl|蕨TخD2+J?F,)*,ljƇ^ Dh,lSBZ!譁B%{P)Zzp칱2SG_Eʟ&:?ˇ~\ :JQd:zL%N?WP%A `r͎"iS ɘtTV$2H|̠Ti"5Q*4pc)a vo}X "ʤI,J9&4G7rqWK*G35ajω~Ӂw%yƎߛ%xE~qپMLaiւ\:^Kbp2f ߺ.j=ި[ l)`D^rcp[8dՙY$gÃuehn 5ysrY0c#>ҨS$d.2KS)ɯ3&t+f 0a08J.D Q7!ȫ\RdFg˧z B c2z ZEAYF4@zG-XWN]iMvU>C,Z|}$zq/Ա|w*Ah&K{\=07WB8!SݴKD6; iBscֶQdTF$=(cbzR0H]>:EŕpKt@idlx$Xؘd UXZ,|P,STm|RbK:DMNU|(L-^_{#6K'kwfaHebL1̜=0ͮP? A@MtPF 6L#BmBr 6M4Ej~\p.7nuڭ{wXx.x#9L!GQHX p$r).%9vg= 0n "6&wED0"[Dl)]2DmVr}0Ʉ@N.4P4_ضաVZMxMj {g\hjZ؊ͮ^/NJOn$kr#]s#e$UǮ;a7/t— oBh"+$UB_. .LZT_Ouϵ  c8ycM;K/Q1E QJO>~oU7Ӭz=q^oq&Lue<:QW'7&VC&[Qd{&d,Kl-,˂xe&N wi>u!w<;~HJrMYL6JwR,Rn+[`uݘV|( jD}uaKm]i;^hxOd4 #}hp2 R!M+{3ҒI.2$PXY- ƀ6D&^Im\ia^6v2A۲ے5ȤZx0Kٻhu:oJY ̮ SV*h)T[,je.N<^Yau49Z)X6NexlwmH_ewmG^737bl`i %ZNb?dYveʒjvUWb`t4:}n,Yydc"]")qJ7Db΂wDðqe#g}r]à?=FƯymڄs\'I|}C%ʟdc fHJ ʐ4e4*bƠ$"ւaz" zdwN)`8^dP=K|7oE@#ْ\}" Q+Tr;*[}_K'e6)e`4zc,3y)JxP!;B>G"`oJM-a$EV 1;0W\O'_ni `>?5D<|a&'j5Cr07¾snG|]}=ޤqXk¸ J8 D=$Hgр'3.o`Ӡ/F1ZleF͝QFGR#|t(wָl䬗Gq7 d;r{}<[qfmW,Kͩf-Vvע{aBB8 ". 
|ɴTH#B?+f꘭ۣVZ#a \N );0jvf}Hy>%,,QFe~]$H vvl: 3vJW&Gӎr?yhCq[QVpe@^JũBXJXjدЂ52_uPMr/)F*#02TD2'Tfi%1  Ayڣ n[y {3B$\r5&y &4&96>0kej6.u^=tw*Sy?= )y%c2 Nc 9SDNɹbNLY+;=)gNHͽaYixV3kB*ʥCRMk\ 2tgx|~(g|<1>SwznצR$,̾ik4קKqB։ۨ أqѦ7V\J5ӳ;(Uqi='7#Z~On]4-Gysb̗8R}7 "`/7* ݅'woB]#I9㏏t5 Fw,@}O*3f1趝np@D:G%hCv)FGhe$ ́}>M +G yRTOl8oOw?1}~x{ϟxgw{w[Xu30 .YvѓIi'`~C\CxT (S7ɸ4#wP|ʠZ |y;? KO޶ݟtjU Ve|VvX,Q` e +gI1$zgK(,x%0iSX$c.EERNqV4&g=Nz2i=a*?s9Ž/a6ri=TjޮRűGGBGlFӸXpBwt^8qYDk !e"x78CO5#?̏TIWFI!.f3]z;(G+w ~%Iގ~Xz_mw|)z@͵ˮ zZ $u-Ꝣauť5v~?+:"kcߪ,^nz9J[m,, t7ptב-XM0"[4WP,'ȕսCqm-Lj$7v8n;&iՃtmfRʚMTYuǑ]1P7V]l;ybȁNz+Z,X<]l\w0U;L @. ^1%6QqD+Xp #͕*Ae ֢|҉]ke㍝ݏp|y;gMG:"^ R߯nFpZ8ihFY(l(NIAc.dX_2XKER< ID<2g(3*1{.<Ƅd&`HV*C0!IR' eZVP2b=PkFs+%"3vFΆAnʅg*RGjz?~ u8ߠ5.M]XfSLYftݬ›w=U}sJ׺\IZ%+MB!<2 ޕhݩuQθձs68bAwsl=/\Ww[# oq,;\tdysWdLfmM3ư䷗'}mQU˭d3%0,HYKl4u/aN(E f-@>p2/"3z@2ID5`Ʉ1Bp:`!Rd* Hu q{ (cB)7j<۔DGEdQ0A[R.k%#Rs*Yo.kcLw,H+_>x=$r_=P2pe @~ Fcz!)s(HȰѠR ƃ(F`"0\|[ߚ8-`Dj *a?ePJx ie{)TA Ν(0  ]UxmӮ,"=ύFL޳óFW{ր딓G116Y̳zJ8S8&S@mҸU:}Unկp)?D!\%\MO#7`fFOVyr۬yO tgEE@(K lp+9{qի5{|*{\HS?6Mu`(z6="ܛC,ynhsZ_ ԝn69\W;鸬k`ןM]z]dtmx2nNnsROTw?zw5sQ[\OYp,~6(W00|[h+HE$%Fr.KpVĻK"[+eC}K7Kn[8=ޔ _e=.! w E}0ͮ㤛IG""Faq* Ds0laHjʐcB <* ?s~]`FʩR ˽)dK.r֋h8Vsr$1z'rhoRvjnSmA*Gu4a҈Pᜑ1d C-RgՌB})n`P>RL@Gm7 ,3AI֌YWl.uuW#g7iKt}sÐtM9'nPhy4,g߸FI'J,5JG\q`YCu {Cd|FӐ? ֒{* %^F0JL:'U\v2 hlܱ֖v`W]Gd*s`SHpHsn\p# @N+$2 iU3bV A ('8T {0a6rʨo6%x,1UfzkJ+ : ĤN @T$ 90r xFb^Ͷ%-a{I5^V ;>*{=Pܫ"B*Cs>)7^. `0+xM\WV00PA)3ZX&zG}0Y囇l2K[3hVϰ"P9N߆I |wXK׫?Ɨq'"NJPd0&͇1n̎'{4 km>ZԹH@HK3*V9Pp.`9V̦ףx+ | `<%OѴ_mj$g%+J ڥ!93 9# $g\Kr1,W;';! 
͐)A !migeUAI$9FEtۊ[ݱ;{SƖo)0[dw$_2XtY:ξy&sb˦Γվ▱-:<3tny$c};]:}d\(wp)BSϘ8P F 1&+j$Gnx>};H^s^yc}>~_vavk{&G~]]u2v9RxFYf"g\Gar):Ei{BdLc/78FɢJPO`0ט"OH`IkuQH=A؇BXY M)#"b1h#2&"Vakod˯G/eA{XOGO5]v6ㅬU't>77㋪F.wo֚0n"@;zf`N %D*%86c=q9 *kc&PfTetۨQ1,%!'(jF}k䬖q7 )9ols` b6Euӯٮ,g,&K 6ǼJvaB\8s" ;ieFf;r{V>[YOoVP$6=]tފwѼyl5<VJrʅ@ `BKMW<r%:i] pj OلQc0+Cʃp-aٷ`5e9kS><կOg$Jz`},97M$39^'(Fx~0->7N\d Fs" Bb{-R*NƂ_EP$O-+~MT9$ Ão|AN04K;-)9IE$s2JaQiƝVc  B' LV +%C㞇A#׆G~U?)XUV gDL1ZHB- SCL£ FP߱q;pjiZG`0a h͂d8%.R0a,Q'H- S~6s^C5C~W6g {K1ք2{$}B$\jL;"a ?i C $.{`ָT( Cqm\Uڃ)WoLޔ{Oϗtf d*RCvis 0EKA)Eb J8u^6R0 8+: `5&5\:$UxܴN0 =wV88l>;u'66"G Ft򩮯D^Of?P)'4`JAWN6 ZqT9L~TΪi>uWoޗ/^ߌ/of8)1WKojlۙ"`/7 ͅL ll[jj47e > LV;Uos "I'Zm+@|rt&2>\#~q6NBT)K*V/޽y]ٿ~w0Qgoװ F`\l"I}3k7Zqzitm-ҏ/7_^^mI':ݭ`Jz3? b~U$uiI?"ޭ|ʗ*Dxh~N֏_[j#=WE̖T>1c(NYiR =]ށ5}%AF=)XGJQ&7n֏Lqpxwc|Fgs@l{#NaoX{SXhEY茺]Ls^}!Zcf JHJi nH[a1S3F!T+PE\K&xIP@vZ 5IzXFN'ǫ0`o& 9:/bqoGDk !2J蔯9v>yU,}4Òr½R=L_\>۬H?&zu/v ב:D@^yũfڼ{br~衤'Getrr|w( {6m>zNP]CL$ZPby5um^7p\B.KI f'' u' rû Ŏb},%//L ,ƲUD=jIm6j`GjۆK2]_ȋ N퓖[4Cr |{ۋ-Z%˒F2MK.2 1{Ľz9)v(6(V9_f{=00hIr^n{LեV7L}WTez[טRMTQVǾ]d9,x4 fDDC_OE_BU'uv(a{I {/@.ۥS\Rr 8"ȕf,8wJx 2kQ@d]k~\['{8n>IwZM{:邥\^wj/BCY̳ʩ3K>fG74S"vf-'=}yzTE.MEN~'M&P/j8݌`%NC@y7Zp|&RvoF%V`2(PZ`u%\Y(ϑ3p;U}C JfxZSjBxE›V#g5aa$2fTZUki %WEwSpe8 ;P4lbD< (1sᙍ0&$3QCZE2$u[X1c2b=PkakyCBvxt[ƉkzVyИQJBx}): (&p;~ WT>Goe 슪UxN4J"V&kKUń񖻘MucV}WWsSW/G]I*JQW\q0*Q^]%*u^RҡL\E]󞫫D%zJKgN]nĔ?V~ݏnNbqkiKomP.g&`!l y0N?fOU OLqe$A-‘ RA]P$ 4Db$m2>^9S B%]U1 /%+(rξ~O?Gc^7?Qԙm&ᇣflC /)iKY`6,} 7 #[18V.1a$ңSq>Ho=%kfk fGiu@J^SRwe߆A. 
|wCWsr0i"c>Sd,> mPfWڂ>~P d؇A5 m&1x4( rQș3<:/4$Bb}H-`Dj *a?eP* Z"R(;P`+(4t>XcW%b<o/]D6"]Mࣃ1bWWlC] Us<ݝ-~"XPeȾG;ü>h =l>1yE}Z؆?茞]MԘ/alvl%GD"CCTS1l⌺kzUuuUB/"V1ڍ?j]DbFɄPzNVS9y2Un~~ 7SU+y+Ge!g46(-QY/ٸQ[S!0<TF2) dso0i\Ϥjkjl׌QQta5WʺPw^CMEiZуĒ#cdx2!yl Ӝ8L[H} RinU#92Ғh@e, =M$pZxDnӑY3s#q~{6ĩ#&}~=ۇm dQں7UL^ۼ׎0>mz>r( y5AP8 ~IblDZ@-!ƥ`w9=Jzm6#RvE5rK8x45 wެ|}uY~<&WL $1TRf0e)&r msp̅Tqtȳ4܀*NqTQ\L9Z>̤ٟyrY\icO/y`SIzSSp-9%8g\2qv.x' ۳ɔ\3N>z ˍd͂OȌŧ-6pInE89ן'L=m }HD+x&|zu}Q#1cJR,tMGS壍7d+.K˃;(rO{476\ٕ L/|\gOC묗\<_\V]  %>^rq8[71kgLWqYe_HG,X,h8],nUHD*YnuJ*ߩI) )+bƱ }3!{%"~xo4HAAmTOlpvחĎO|_|ǿ~|~{:b,5ב؋IhSZS|֬S|yeG^3cq[ )?~akKK6hS6&y^Z Mb~5;ܟ[*!WU!0~@`]\V.'V}>}4z<bQ=`ey\6?@ x?cPTX"R쓘REğ~rhts+f|K /|:@@$*ZSё~>cBɉ+;Kh<53B B6z&7u9&'n&I+WwVԛϭz% ['ߊk3nI!-q?NΗe[:l$@f: yX.cM(,5&Z+:A u>T%:.&GOz%ȃZ#c8_끣{m iT& 4+y נ}G5DgKט&X &m}6ܦyjH7V'$Zrgoq;+?8h4mчևxzzÐBoR,n7X[Sɛ8X{_y^v7{`>~#v=]ܷp~O{K#ڭ~5 {*0+NnNRn8 N@ Κ)opXܭ* hhof6 ^g_אwdv gI:?`uBB7,-/7LkXzWUSgڰH7m Wdz~-&;%- i2q@RQI{ZɁi*aa&ą&an>V7P47'ͣ̕mɖ1s']0u.bO&)ϢĚߵƲgi`tOMx#ɌMǁ0'lx6ԅJ+ƺQtk_??lxʻAQnp[)pR!6 i׈$7)nJg@#J)^&~1֝ԭtدp`͟RjW/ouk-?J.HF3_ [/q0dPfXfCrcs&4:WK?Tr=xr=rrNS0I&lYcD/GO{N2ڀ] G*N,w)!Eㄕ \Dw*h]cRL7Kطt|ۖ[Sy~tR)v{'&pn mukvJfvs+Уi;tqkImk_h;9#2_?{WƑʀ ,V161:wOE$+!Y)r(R#!YStUuu=wJL겾ťK{ ʾ]|]ݓ7}•wh0yӳF_Cl 5_`oe~Xzfw dcO/]cüZOxb{[e~ޠC=7Wɹoޠԙk#>90p`F3:)XS_ry G~6W ~L}J`~=A}(l'.&ؿo^m'-ٱ6Ln'%xR$d- gqMYNqYj1HpSam6^_[Nu˳n87/?] 
uRUwVa~IsM۟`gau\|Oׯ؋W7:DMFLkOOnh[ \%q5?iɮu{JR aӶpr|*I' W!jNQً{;2B:O]qs:WQs1臥审A/L7TNCeg▱广n6 c2 OlSz1wi5ϊEw85i0>oVh͏9H+W`t,p*IX'W Uc$8XFR WOߛF_a񏳳NQSIv̰[_}drλ*3ʙk;/_&r웋N% n2vQ~/^!AU"O'e'aGb`˥w^4 ܍&[I ~秚6o"*smg:2Gi,R\a߯([4qB@\^5Mw5/@O_Jmh|9S$Vmb]zDQgSx^tRI*oxh&^ʐp:?\ TFK 7xlY?& Tozu^ΣS f f3ɏ0m% ~GerD4 ̪Q!fXx}آ3i#t4ӑf:LGH>:ZHsŔLGH3i#tV1*f:LGM.H3i.ڟH3iy`f:LGE#t4ӑft4ӑf:LGH3i#VT#t4ӑf:LGH3i#ͩT{E(V4'g:LGH3i#t4ӑf:LGH3i#ݛoX(ږo㺌wtRiÊGJK*Ka,O???xS#czU D>.9K^zkr%3Ïp')A<[yUhUn59 ;?,@?\4΍}9[ia`ɘ~z'+Qgoj&iI:!tHx8'2Q28MȸQX E9ќy<'v 2$5b"jxT8!,I(J[iR".ۖ8h8Uu^a c:!L'3y0-#AXVj<cꢍ1He㨽0lW*3:L* 2Բ(u(GkQ"JH1 N9ťoW12#XP1g&HRiƦX(ZBp۞g \t_\3!cubw~}Aiz^wu:$I~/:n0 OI&N{ ԗv^ÑU¡_`u;] z#MZͫtNSRΥ J2(HRqV͎ę%|KF)Xҽ@oU]$yݤ]ǐ5*Cz _jRz\TkneQkLm!Y͏O.|5Pdb%3UyQ\:BAfe6pNTĴ=xֶpbr8c7ڕ9ӯ'nK[&z6;{r @B҂#`SF/IҰqiz/aT^he7́aE`b![ZcI"Ze:X-XZ{ji&/~z&2#CYyYL@ cy?}+o!mpeilxSl7F[loJ h:V`6}dPCi<Pr.`s6"k1e$"_jY"M;I֥UxL-4$GrDieD醞 hLޑL٤ړIŸhhˠqnpy9'`VUmP)0WJ[%J1!Q||$?û~Ux }\6Y/eH8P?\ TFK 7xƷF=+p/W:K\)qOF__\{VA3@3^S4~G]uw儯oz=]?:kw՚ܹ(fO,pfyqGB!D(4BB,ZBR`D$49p.䡗]ȃ-jgג wDmgTܢy)2[L z1tP{S8 sx~C/7p;c(4CRb.-M 81($h8+o+ n<;q/T8K ,^pO V^boosoO?wza4UҡfYe,:~޹-8:<*X2tie$:vY=o8 юM@>E]1n_=9 ,6,+wKIi@Û_~' qil*_)B"w0eYpY!40ʭR|oGǞrNB$H7ڄ]O6]hʣUYco/p uiVHp jv^uNgԥfvY>³8px.ݢ떵3}8bPOG$ xĢD_Q~ !cഈ-ma`[00hNP6X7],M./;n/8xP#կxHw^9whO;KIy}U){6y6xMk?GHB^~/O {K o I~j`OCQ؛zq34ǧ6me˄1RJIiAkK/ N[ eYhKf:$j;0h8QXkd@20,#XXBըյ:$tBIKX*P ^4!13 Y& =9@K7K뒠]Wœ[Ɠn,[lNo^5ά j\Q,Mʎs.c)E<z0:-ړ WO\Q)Fpg/"JyQƴKjEC8Y ҵ mP4TJb!0KԈ H!U0  M[g8շG0*Eߥp!:C@KS"fpk)S 0bSIkQ1FX=<7@NYkmH_? 
o8(@͵wdmWQmq+2O_ǧOޥ<:N0;mľaikpW_ZsF: J#HyvIH}=۱YL͕f@C-GI*Ve]E4\ qXe`2}ܰ_]#QR y ,ߟ)` n0?~O/[?q׋|쯩c3{'7 |/|?0Ňc_st'~xT~^qӜ_yz}`Y#>̆<ܒef~'m;:Ouq\'b,2Π,7DN Dlys~qc|ԳS|!/ގP9O]Dȋu^|I41_cl?8nC v[{7WG 1D vc0wKG{njeL`- &"s^/UBxvQdCJZ0O.i3"ۣ<H@ZN=(\ԟ2xPyv3I_7>K5Po DW[k 1}Kkg?H_M}i_O!KJ6o:fſ~ M_4Y_ظ[P|9xg_6VRxY=>m !xZ%?:]tlj^>c zkoVV_?a\>7ӲտA MqTZqE+.ܭ|X,V\_KLbv0&q6?Cٴ|;僃/nWHME%&qzu8§=p#0Yn'K(肖Bv{"q꾬fiH9]p8L<Ӻ30!{$\n;~|tq`swkN h=xftɴ4uXϡw&7 ]֝*}JƸRsJ3g||Y4n:Tuƀj9o(Ilsqť"E5&k*áCII爐CzM͚|fm:yHyflT8gV PYbĶm-a ]1 O#5|NKtcki[r]>9_@醼j~kYsqGL~(DדWʌFhgFW?rm0ԊubJK.;͘/sϕ  7%KZ-];Ig%DIDHACH9KU"uA5e-z6d92p͗ÉTp:Mva9~l6Ɗ\.omR#Cgyf\{9epgKqʏx OkǼ Zh":u x@iaPRn 3h+m,[tS|Jp_?Z>v?b+_]IZ_dŘXۀ($%#b**F_ʱWK9f@ՠƇB_4=QMJlF2ӟ'1YhN}vUnM m($HAfm:?a =jwђJ2 oP$9*1߱$_(, d-XI^t?=m8Gȏ6YWlOt)PL8w@+n"󛻏F8߾-,QS!KfcUÀ &9Ę*N)-/u$K{)tҠ)XGU9Y&_i|Kdm evCAu/`c+\v\VFHTv TIh0+oB"@A8!Ҳ)w#oJ2el$NiS"sL 8 52~TKɥTپҊJ >Hh"dRb)a4VOœ/!ɠJ۔ D AG(mEWJMIhzr„QLB&,(+L L؎4fq*XhG,<{'˻ʌwiX]/h%aĎ)mʹTvQۍ=2ؽEg(" 8@+!RQ& 5:%Q9X< ط/ƺ" !33^tb k(dUV,G(C`! T7a3qaԯU`Dl""6FDqDĽQ">(d6 0*2e|P2$\ȹ)"zYXgZb@3bBȖ)'؏ NBRN#b3qH#b:iɩqqŽųu2}U X2%2Y]ZVco qylM7/IO.μ3usLiDg!ggMbF:|UpZ>5j:]PY蘬. ~\LmK-ǷXڍ;^ֈ'g-BBh y9F`_mF\V9ǭ~O| ]?tٻ6r$6W! 38` >.$_/-˔$8,W,GRtִ:D*_,Q2G0V&CTt #@OZbGwpVJZ5Db谌 VqhiVEZeSuisG*; d0Z^_FgksMM[d .eueϞL91L\tS("Ҁ(gTِvQ;:N[ H)VPI3:V7+bs星Pj˨~qۧp0 k?֑#VZ(-B ˆwqٸU"'8L>ާqIrS1ٹJa6_VWژK&Kx,$dn8*ǩ-~3+qM8@ 0L2 h6''xv#,&Hrtr#fc1Ldfq7]Ҁ v}"ϋd֣O~0Om^}HD#ZOGP)!Q.jrM]SM_4W{WP07{6Lŕ /\g)0gpGÓJw5$j_/#9p0Ւ1-YW3ue 1x(2e&G]9<Hgꪓ]v\ r9mW6RV>?MN_izKhq=$_ӏY,%M/WGYl5,qWHl2-|D+OB^|,. t-c-P/ap:Ǹ#DI+&Jd%HsLHɗd0^Ju DKҪhvV<+wx;NAO}uxzk[%`S%"4(WHq(Ch@рdqD4h =ܐmm<]\2s잚t;[fObNhKux5[]_=|=he2,/ё_nu ʃZ#c<ˀ&q\&"ɩMY> 6tǮ\MbbumqHѿ.Fu%ɖ|:`N-8 n~s۳z׫_?ƓORdCx8hp E=RL3o8y&ҧQkpVkPl:ӆitQboT/9%n㍟6_*@) $2m@/ o4gпЩCJPjDmRn. H,b$@˔3heY!3KBȣλDi!nF17$Qfx9q%F e@I>36+r䩚8G'N'KVN(.#fU5| c!Ó|ܵg|GE&1kά!bBD˜g*Q1fB+B {bK-</{Ӯu~K3D97P  Lo"f$ƪ# ќQ*5AdYk"=6bƝFvk#{`+e:G%$Ι0Qɂc\+52>8r}$|Di؎ϋG:vY77J${9lױ3|^&|<%K]uϓdzɸS6i8R̍7~7AFo?!.A=wo!$ўCIq~YyRJ}<@iN4R_[} X47=C)䐖6$7 tiAMLGfUVgi9N>_R h>>?. 
ݶM#tyg15b~Z 9{_FgG;'@LΓ kLzf#0j* }l јr'+uё6xqܣoѩw9 Fa3zn*jI0"jW&g}v\2 6SzZ[,D4 ejKeZDjᖇ](VtqrI\N=D/5xg%9Sr8B`IB_m8u_-\t/0d;Y`?cx>z 0dP!),9 ТdA9,}ԅSWF %KF]MKT 8?ukN5>?&gi v/]WhQȍS|^%+t4 *βdwH/F_{V0`pF< {pz Λωp#:ng-RKD^ul?smI^]K!jk7"? ,ErʫiVVe`@ߙ_Ldz)o~eB.Uh@.d2Gh3YbJ˩?wҶ?񇿰R0,xCvVVAiשt::iz7+2k!Sn V9E`u/u*Or^v?\WukXJ `U&@ MƳ!zfъ# c99ȄU='3m1*emLD.~&-&{4mĤHo8IJ G&"@~q,bմdW*E/_X =@с ʒW1 ! E$TJ`&}CմcGx=㣆?\>p.lsf9sbk.ڦQ,j>g?P~ُi̖x Y!n؀R%!x%o|{L7¾/,ݙޙM:cwJJJ/9Zt_k >j@_H(/ KPp\$8[o uB.$xN8[+&N[d( cCKY:"`yaDV壷bHP,X ,ȐQAKR+RxxoEdq#뼓(6s6%*[MHR 7C2zS֔Q#JRopY'OZ)o#B eBT]* i9,x0hd9|a9$L%)-:)d6=ʱ Qr>UeCdZ UWE=.,Y}3%  9][oG+lr~1,^8@ggЗj1E2ȖOpHGDIcز陮 ޒ] qHKMn)߆;Ǭoc,yTsozZ*Së_[iyׇE'4\/vg[9]HsZb :2:8Jilxx1&J΃V{ EWr 8aO KєH&C=Eь#RNQ́DQHhU(C-d85>C-L%ʬbN0p5@sؖ]Iw_.exڄrƽ󚈫î[)w6Ck; t͊#!{_g_wZ˄t CZh+_"3i6FI!= z6ݩ 7YRep 8WWޅóqLgd ~*3CB\B&qѧ,p"N iK)Lj=~?j}vZJkeQ9% hWF&rD0NW<hI,Pa $ݢ969f#5``EČg\Fg}NUN8SEn^gPuz6Z񬧵]|o&MYr30~h+@wV@ V2s`L %'uBǖrz4O KÔ(\h0.6$9\N.["r & f HΓZ/h :ŋ({#w&ۖsj2Z A! aD]`=X4a)^ Ns*_h#9{Gz@ %m%!/7Wr!eN{Z*x4yͣ:q41;a jNubP΂MG(%M .Ink["N;9;6*NrLg\.IΑ{$s^jӮ>!8B^z.:7r"pkj Qk#!XX~&_WV]c+_? Xŗ/aLkS0-NPD2V`F# % (.hvg=n*cnֽ²}PUp)%ʩV9&t*BQ8c2Y%P`?^tK$Ж2*]s$ur DIKN`A"d:AjYn~q7LPۣv JjYH`f6zí$QMvrOz%ys>C?siQ,ZY.هb6'#L4;$jT~ r, ݫ{檫aq$Xnq">g#jN(9甇#7_Mώo ,܀w@UK6ܼ.ɍ8!pR|<\*G)sE+Pb*ERO*MId|K$9)QثvxICrhcck).!0>RXSݺ75X~_yW]x=_Yg[yVKbܗ {Osܮ8H̦Uj7n> ~4ucOBn鬩܍mfYޣ F +3g1Ab1ѓ>{g5Ci앑{] צ *gBFr8G&R=+z,?Tk'1Xo0\O훿{}ϟ~{wɛ9y5:BȹkH$z7]붺]s#Z6G]O>6U+kk%@(~}y=MBpv=f9?l y b (g|1IGTTIVTXv@imޅ [q`$79xvRJFBHs1k>F(LpGȍ *kNzhBމC_A%5K%F&dzcSas PJ}Vu*5x&ya:;[`U8;V[bZ(V;^Q Fո. X Y0` Q,`]{Jxh<:HxhMgR>ceXLz5 @Jq&vPu%m!X6GzUف:{G\w*_{TxIy`Z2`{i5mpARxKJKEj^n%ꪝ/5{¼Zw'hNeN]Y*;u?]vTvIJ6J#:u %DernkK/'4ލK%o>F;9h.D%Hqa)HftދI3RIi1#eg߇]^-JTGL/ hhw6ͬjyF>_ȏ*XC@)MI]D]H2"RDKP $fZN#2/ZvN1$3&^@& 0Qq4(\2ja׊o4v 48 jJOɇ?jYw|xLHBHԄc B Xfcj¢) T)*()vp<=iT=y~DbHpؠ LZt>%F$@FKgb``Tp6H7&;ֆ'2@?mNmSǏ9SAdr&68%my! 
H T%*."餾5?($C2Eԍ@Qۖ]&-6xdܓ\wոLr᤬R)j3*v8DKIzb0N-e4I DujjVKzOk螇S:^M[RTB!YZ=j%C JKFq'at4XWŬc<ۼs-87T~q6ΑF;\Ձq!_h A)MdAH=Scz= Ud:Y~"4fqr, ('Ԫ{U7U7k_ Z>QB-GuhBhk<ىP͵$RW\!ҷO['x^F%E\F5z4jdV-|i6]|*/,XMp6~;LAdrvR?#]a8ִP / es<[ W,aϮ7]396ƪ2bTͯdkWbF64EA~Ow1Mc79y8)?ԇl͗57T)gOYƓd]I4L Z𓫥o擎ߨ2Z;"2z=(m;\|Z\UTڸ`69y>ΓeG KJZJa}R%Sp},HlE[VcL0%oCET,Tn/ mi$3Y˕R{8T"5cM\U9øҒ|Mfk>[3Ŧ3;=~ۇtw4}-3&/ j)ڴS- !%[Q©fcs5N5j'B%1TrL)s%T)52c%m'f> 6)|I)gWs_FN,y>}0v)_| 0,_o=eyNWiv˭WYx[_\-Z`n]pȍ ku؉7Iks֚nZ^ ѯWOfyo־}M{7Ol}[Wnݾnf;Y Z|ˍKW^x1ܭVo85__!P/v]4nMp+XC췕Q#mA6_zFjh@~ui˜y~ ~$r/t6Ǐ;~=cKӇCivg;i{glQϨ{'oWȇ7酾v~\OɰZUi.A[kJ :iɤU0!ǖj6ٖ~/SHrRi*Βe=J)%Q¸͑NNBUwxV#,(׳wv{?fgut)V9ku3 :[IJei{Dmf{.kXPح \.?V凓 XՈ-aM u.z7|$5?,l6ԛ7~zrfNy[;X~8b 0ϳaZb8?{=o^^daq;e1[L8lNO? MW=g<8y^GAs8&/켻@;7x_7gŜe9sbnut2t8/ _ȚdمƘ4<$5;~GYyӺnr|2K/G4Ge0n+"+qsVa$)NvtGYڿ8{ÎA741:?2PcܔG7=: 8IKԎԡpM {{ab{Gي3{\c>]o˓9Pyl0f$|}Iy#Ӳѷ^BVԁ,Cj0$ot0t0ԛnH ,?13+]o˻PylG5?fIi{佦#riYyG?]6L#g=H*;VSտ%=Su4coYҷZWxW|O|_L֔K,YbOϲޚk<+&]֫*~)s:ܚV1]֦z3TnUdžl"S|wo>?V~Y7~dC'_ww>|}zcm*^sӲ{fΖtTX=A1{k@+M :?풻cwknm} R7':ՋиE<_^/n9k{ ߪBMWb뿟Tx瑧{?{+΃NJW~wwtW^;?y ܳ5;\q=RぬbQܣ>e~B~:i^P] ğo5Le0'ٜ71W~zr 5jh*YoJ*:MQR2f}$)W޻Uaɿ~^uF;6Yyke?sQo~?ͮ'g6-RV*P%p-9N⼱lJ /V'=ߗN11jT bE ŔyA-ЅNScG~IA#ÝUyZlv ]M~1_jM$uoݤd*qJp-wKL(VƦ%x38flƤ[+}Q&ΜU8?{ [<_`^jki26!i(rhlHsyceXx. 4y zr@#Қw-";۫%+ V+ .Xx!=Ä-#T'ǸP.Nc_6Yyc&Š-[r (RSm58jm7GMU%X!H%9RhfWZ\ Vg))0ލ@߉k}D##k$o"eǬ ])U6eRja~9`jKi (A5D̤y1gU\чUS*6R͸2)Dn\r^l,*Pt=KF }x{ \.%QC@ʤUtȗ ߂Ѹ j̘^RNi,ܢ@ *Up sVk!N=8 5p;k89(&u&ʑ=`yUɖޕxȠ-] %µ[M`s3Ag\ZF4TaCYuq[Aj+pTa0T3pƨJҴ>k Ki)V1TudҐK 2FplQneu/&**ơcDeթܩ‰BA"wa-|g7î;˫׵fؒ|'Z1 ܦB$a5W  .r,,}t,}V2z89Pmi>Z.]U}Ophƍœ 2A-)J$RDN+̫3 Isp¸0F@/> ݗte`&G͠ ۀ86 U&eSY_ }UY;ܶBMf(N6ڽ=Sv^y\US&btB)V5Xh;Kr1oz1΁6DR. 
-r!-& 7@n05M%$bedrTs@AH ϒ]I`cr'm>ȤQ"`Rɚ% 4>!:IȒlϱ7n,L  3%2Di9DP?z Z{hw EVtEcC:iϢ;k4Q9F6 Pf9vM/r)H*p/gڨ&(6/V4 ,bHJ-ShӸ`ynF˓ysحv|M{yr~V1ve8Q.nnDh hfѣ29GiQ$HuZ*5fߵFH!#!e%BL`ʛӈ ofu(@9RA6r# Bk\ 4(rp7 )cY琐'5)7 6+I^f*O#+IJ1҈vTȱ뇩 O:] SCuE#Hé@Gu127g2ցNi,~ hS:QIǠ7 88kLF&LhIXAJK`S?!( B% >Y 6):Yky~9(J>>?7)".B;*J7VP0iÀe6B3 _R t48&՞SNά#I觙EW+?vwF#$EF$^R d2+*22"3Hꬋg-[Dl0;zr& OD$lwMX.DYlTUQvW8hw""x"`ઁtU:Γu.L@$n- Q܊Ƨo`0=J:;%_7~̫95`hCmXt 3r(m8XG=yPyCUpٳ"@2gOp﹓@ FŊs%@J+z$x" $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $%J~N$,["χ gCMmz$d0i I AI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIKH eC)j!@Zs$uB Ix$@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@/2`s"@`k d4#ِ@ -gOIHQˆH! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! rHp!.ar5HQIV^͇"'2h8ɗN3/p8wZ.ryWaT8tƠ ^,? )|li|`Tt3GOW#x +?[iK&~];]{\39p*=C9+":eB׃p:f ۏ>o;`m`vxUN fsoW"7%ҁz|y $,&p|C kĄ )9\ 饒25V'-<Y6h(?DM) q Ю׶R^&Il 0pemɁՑʫBt)/]/.x⛟;oj6 lL D'%ju Ʌ|cw/k ,~$hN0O!fi0* I)%0LRo Zrh0 niy aXlGy8̹yKfxL]Lvp`d!'a%פdd_bxb؝nEBh>ǫY4:`r_.|9`'VRg.oҕܠ 6 t ߻Q7#H@ZE (fvo4!tsՂL?ۅ~Ӡ44 w'þis=O_K)_K}G.n3S5}sS8nJ!MjKLAVb}sӝ˳z3&.O|qRj: f3fdV{o0#v'X7r|v]Lr1YJl9pW.nf]Y:&a;sfdo r>{p}OF[—EqwH}c>L<ܺG57SrHԇLCq{ϛmN 0 {k=>HaOg㻞L]JGzp{E[d隩+ᵒ}b?!/)m,KfѺNq;=v}N5v;=oܯdOo 8Vó@}}B#onu ,Mw~ز1[]im=H%Bߌ06؜)Cc+?8V?8 `}OoJt]ۇVgY?1κ5JU }˓6}nW,Zw\ͭP\"LD˔1[ r~k])J@o$\\?T:"dcB}-0O)IZ zvr}PfeWM*5q8i8;90)w|竷\˦܉:k3uNrO'AC#`9]%Y( D_Jt"D.KJX}B|05u)QHBÃwWeR11i5Lt{%QT'&ΑʚVʮĹpSQrN^;101SsQ$.FUt C\h L8Xc@Z`X y$ > $G,DP`ˆG%CWV.`:U )j r!o1+ O}hqw~vw HK&JhI9pV54*[7)Վ|Jgp_!No!ƛ5Do_.څ~Q .4|ƃ9+F))S_ nrg᷷=]~n~O>w8lElIoWmMk:s4UfbW~J6{;p|D{cd ç< D_dQjFfGυiLFxO$ys}}=dsV'cTddUqihb%e9:x`{1\oS7aP"ۖ#1|,T˿zܨAm)}WY`')XXf@94gIu rv9=$Y{J%)鴤%keNhM J2-8_Z gSHJT1Vm U 66-3s2 DE"$(#q!It*bX ܈LbVBL0Yu&Aez`d̆ß%-ɱ@Cf\rc R2z j5qa-[ ҆琞[MOΥHTW5..JU|954Q_9i&l 40BpA\1\3YE9=1$`b攁I#u\(Y8pU܁ &\#U2VeT4T[*BR\_x)/q~m|~&Gv[l0dMumVLIјH)瘁FLjQ3"gZ8B&Տ2` G̈"r,Cԃ-,Z͙tuwU{!+cTfKںd!ZNavm;*@%1uroutE;NywUkv`o ,\T\6 4&Ϧt9A^؎mb )z#-dF؊NL!L"YE'u_L".agܮ[~Ud*}ш]5X#^#F^#x }QȢmLAxDy Q:`ۘ hTz&X> 밂' 
*G?>R#*vl2U^ZNÀ$0c{A4)u! 6˛h"62/0*:}4r>d.tSȂ3)ir0Z/l $1C&R.^]rmRez}M+h9-~-dͅ:f9licvX{HPOyP$7,:d.]_ >ڒx$r* ]!=LYCzYkIBabWg%q"zY}OXHPN#EuO5܇%WvYޞgXu1&,٨1Nލ>&HHQ ]`Z3DkB3{rZgDÂ^͸nϬ%d}} h A)_(H2j$uLg.' d^UXdNp6b3a][dYjtnYp6_$JO;0!LXss 6A[єau 0&䝎I N(5j듅FYlFX4֨zAY2Rnϯ` <8an$k"&TSeKQ8]Ca^~'+`џ&o6$bb-`ecX3u.*aEskbzHxjU\%?wTt=ui0XUM<hQg/~}F[uStN|EAlϠPѺ$F# :HZAoEP`YJcĎHJL`K&AJ9ERh 2sZM]4;#g>CeXhk׍"G%voal-yAeNgc\E<{5``=:9ـdѱY'Jv& i&$ÌjQK+MZ J/+Q$dݺ^HZQ)>e=#D ),rsHzPWi95*ƃn$Pw,grQ'G71*8eotk %1d XfBձ1W`x Q\׋@"RQ-J]hHb.IKfAFeteZju|O/z;q~N!$#uT% ,#`}x:$!ɢt@W(N&7hWEЪF&"CqjH0.ٛHnr1u{NؘeH*%:zL!NZCY2ubTZg=9:2-x G  ʫLƆ*ёȎWo3ג*hEoक़uФ٬V.lGk▕sF+%($0 ,d^{/PylUӸRĊO'~ \; UEUM~Hhkf֢Z!(HT mRssZv ) #v} Tϑq%(QJef"/9I%Xp%Ա %nHɈ;v B8&U*^*H'>?e̎^?#G45u~@b,~ GybPN68׃[pDϚhpJĚdEY~bH* Ӑ0yvհ\ _n yui ؊J sAV>3^p =)#R up*Ztf3:9+am$ U&^bZָ$ɭkd^'G+x^gYm=:MB<1z4d9Js;)CFd$r3$ 9 &k=R͑#Y{5Po'}lJ߉aC_3&K!m,}t<JSMF<| 2*Eɾ^vO$.;|ΥN׀k`U8;u8R10%Zܺl]@f>=Bz0;#SMQ"7ȆfW*VH}çQ VD#B$1I@AqY}FGǾ= 4Ԫu9p' 2Y楴Tl<0)^mN#vЬonVR(#4RA;r\Gu}8I"d`u.qF!A3 Tmx磱mDtPMQeVBv80ZYT(M('J9{X'N'Kf߈.㭵SV;瘼 c+\}ՇO>v:|; @h*A8'*& 0jވBFB g +^D腷] tM%#lʠ#sV;`\o{fcg~Xt8`gMTn])J%]#HVM4QmFݻ9~jԄ&S B@=J$ B"!}!mLVS$Ivs/eVk_-LעKյ߹3׊MFR$*e EI)hSU+XgԛkԵ[' ̛:ڥqដ]Ц6Kkfkvu^ۮghq: &Y8yuc ?@AhgyTl2;? h޼ˌR?3tFumXܫ `v<<=OKyIjZ|#BS;O6aqE/XbΎU`H;8mowgl!Ϲ,"0 i<{0߽Mפ a=·\XsU˷_xE4Zذa 4ui%'ŦNfӓYޮ4ux]жECv.4Q3o(J6!l5ƚ{M'^> 뻚]BvAsubuoiz60S_<3ŢM'J2J2CN>a̘x}k*͚gKL{[min29{H,9?]'4Rճ-[WuW_rr `f_o_[xt\̵R~]ΛF{/JղE_Mc6N8VHϕD`ɂѵ O{n~|0~}<+pu~#[p. 
eEI,FL0JAgm2"Y%X %ϴF y%E@ oX"*Imv&_y~VrD|1dzIovwB͂%n=g$*}͕6~%mRhGH=ɓcF4%bJ 9اQCurRV1  c ׵]JeHKݦϲvEI2fOґU t>f sL™ge㬥Φs}T.oS,Ki&\(pBy>TPw՜NP J]$(IEE!]nW辆3};C!8ea(xXZq-!wG_6?e<~[ۆs�o˨ PL G-Β 2_dދJS`Y1 |wѰ&ExbտYq@>W*B/"C"B]4-e' Xie-(=nU+TxOD Yk[++B@ToI^O[6}VdvIRϹqqpggJ& s.EP(IH/AD't$PTK&)QXx.I6*c*#k*ڡp7<%ۺfi B[ӆfe\!:d %WǢ9D,F&D(Ӷg.YS^CN?0/1_DMb| rɌ0HPt^iHem\*P3NEo'=,~cHDl }"HD, /()h[=-1ۺjwZ7z-l|csfk8 Qd'"Y2B'',^8_[!A^2J)Z^{XFG_;:Mw[tbM`7?oi[-Ow $j, 6y`cB%M7q߽ӽiUdI4(ll@"0D0t6; @qlKʆs&PF&"CX\T>0tQA]=s|5٘m)LL<@$rO ,@l'Zd/2vJΚTt\s ;]-ҹʩt C$NLXOCΨ$e]n3X ;|\Pۓ{(~09)YY$NYLX'1`quβA Pbr7ǖjXu͠lXn>09*lפwREқ2o !@`@)]Af'!Z/RMkŅd[ʙڒsm Jei*'X5(BH3&>?(wU(%#ұ m+LT*U>G^ vOi< Ѵ Fq~3P#}LAZwcc:eBP;Gb5N0|<ܡLf2eyskHVku|UΧwmJE4U>fk(dӪBmepZA2]%^ߦk%g)ӏ j'ĝ]<^M;_:e]*->RB ;MLn5uS>s{Wldx|2ߛ-]p0 ;|^/PU#Mj- W VWH,o3 GX5hxe1Ox95itT%uF],״QHX38:nKY~A9MJ/ɱPW<0M9ypryǯo޽~ϯ7/?~~wRwݛ_ %V ;ŭ&z ] ͆k mDo=]uƅ._yŸǛ3J֖~;r=-gZ Ͼf6?|~8+Lc ]TP1m %Q1 DtZ 9+<Tl@y;a)z{ {voӞ(+x5G09bY:Kmo<}(;&Oom37tx%a=^qȁ Z%)mqdȈKF^&Rk1 ; %~[WR( ޺9[+C׆<ϝ WL^o]Otz!1Nyϧñ .x$s*/&4B$kzw7Ff莚!n4PZk},bqfȵ(q& 5@ib2ۇuA*,/cxz@eTި. 
߱9}CjGlyw?N'qۗm й܄ӓ= }=8|p u( 5czdzMFg*+Ѩ#ǎ=t{R<ˋb G^/d,.)W(HkÈ\"egӹ'ǣIY"4hnD: 54+z,KPʨ-a7;A&7GϖXw`[TR6 7(A5Qk%Qm˻-Pݬږٖͥ{һRڑ!sUd`r)2 1 E6E"m'n[nW}9٬>`뼲Z9`~x%:&XL$E2F5i$GJ1cGw} TZ>X9&OQ`1{u:^.˶ui6Kkgٮ; ^.P|j~E4ؠ 4>8 Dr~dam{u3|3|2|Aa@%j}B(RA2a8cIE#h趓՝0?tv6@%ǖCHjVK߳ Xu9@d{x?8ӿV9БQ8)4yvޏK9"B̔m_ӝml(.O[!9B,:T{L5Ga 10WS1'4i1">U:⋱9 ujqR&ll638؏W{9LG'YHeHPDCRb%[J+2%DFh^0 8*0!DPiyJƽ`stq Ztـ/w&Xr,Do}HNNF1^,ǩV㘑|[&?nEzd|Տ_m<{<.DMV1T$%W>l5Jg$3v{Ӎ^=yzGdD YdQ%2 .'EΤs̲# p0&J/E%sDe!_b@>gmyĠHB ӽt ̢θٻFn$4.= %m ~`0sFJ3E+vlI)[j]$_=Xu%Oh7uZlس!]D=n70,J񼁻Fwl/q^(@%I yI'fk "UN,ًʉZLj{bM Hs(T!&{9W]tOu{8[sXـpt27ͱ&2r_(`&Ŏ&?55Ip9͑F;:yݿgTo&Mjn6 w\x0obfd+ۥvw_]KdGHo3Z]>Z]]Q[] [୛^# {r+o o]+BhgU[W?f-~Æ#}2w"~KxbM3]ϫ?Uh,PS*vʥ ;g(7Qg.iʇOEXd*v>&?;D&2}R ~< y ;ګȤ񨉩|郝 W22rse仑<KCrq_pOZMyIlq~w&ndMQ'`1MS;;y<7RהN?i["[/z5tORצ|vdL aIT-ugiwl6sF` 2A]#G F"&kaaGg&yi2kDϚ.*=[^Pݻ6]˜DD LY%!(lK-.E ү/B+֍yoPv1X1m=M۫WG4io@;q˪uK.[{R3nF+aZnjC||Ws#K$qɈȓ5Ѯ)@UTe> W8ͮ=p#/A ǵ0`]~_W^;oKJl!#jxI\vKLw;N[uv+p 6NiVxC gK{i6^5SWOlVv[g/4rh|j㷶ܺn[-8w;vm{%+=/Ow;߃QfuB̭&KΦ]jd.2\/osWfb_pv>q:Oiʲ *0@dy mQlӃ, um9u9gyLoVeʼn6=#|S)lz -Rk& JA~!,#%)cOXn y"J-@}AED:υfB)jcAwxpV[8QȄQx94C#gOF̕Z>gF M| b>?z80(Iy#su  J8ɁxKV'/*AӚzx鸷ИswjcQ; DJlzA*K#Yzv=f9͂N4 4xtTaEĆ` r Rs.$716 )Mc^9+W`@̂yH4nQ<Ҽ;]E_ըMXdVB]c]][+שt~Lo{&*T8)@&޵F`X wI朻!"M3ՇN]6`w #|7rJ˳tz ~h \݇n,6t͏/~?&n>9 g @x[UIn ~wߴct1>ڀ82lv[ZmxtЮ8A=z&ŤVhHkQl0g5HXA Ӛ#"vֱ$'L A<{օW}ᆩ(Q4<0Fc4E'gDqdad,hte\8scc I2& O5Zks 1%qtbKWl㐶ZviZ\1-癠Bq!{ kP-%kQz  VW҆ڝfHD< q I6WlOy&F!ѠP-HH!I( D ]HSpg?-WguF.@~t|oٺ5SE1sv䌉6$J8_̋؂sQlC8j 8#Uu9tPAG{J[)Th47̂AG/S|)M\kԜN)}Z5~6brB|i1>9'Z\j'i|Q^RyN4R/ɥvBo2$k:/!KB+F,wT;2RBrha!qPD*$ƠSJR-B0 TiXݒV)& qơPG=zr~0=tήg~>mAml%6w#Jj-!JiTyr>GNWo~W3; nf 7Ex[Ih(^GqMc0.tQ 5GAc0 LۛCHч8QN(0X|ͷb0_]TR!żdn)Jf~Ԅ}.U+uO~mƼHQu!OvJ;h(8WSՖ ڼꑹZfےdo 65"N+/E?\?~7"EN:RH5xbj'x3^_e'7<̫˚&../>l&M=+S7wl]p}X_aJgKcg-q2]yfl 8oȳ?ǓO|WY~<~{w~oUͷo~=z9|*>->FAsM"?fYBF .sQ;)6EjF8|ҚY{T]E!$HPϧa*m;Wdָ*$F \0}`+INRƄ0{٨ O?էڙ~j|Sk茅߇/,ԆW U0GVOP*&e6\k׎3[K:=zIS'bOJ:4ܨy PNOFqYDY2 bmkMً`TA҇*&r]c@3CQ/Q (CD8* Y@/~9= (:J^m>a6{d_q#תׯ4"-I JQ%s>r 7OEAxZ%D?9T 
lyl;gh *bW4{m<~ C'VtΫ*hp^|-2Cz vd%(at\`RxFP tJ Rpc)v2e r+(/ F]|8؎]۟o3t@f~^o=5 EZ8W/ GLFD=qe=uޠpY#ٻFn$U&wi.;\$wجaQ%E-M_Ւ_jKLRjvUWbQ\8VgMO&0QAq1Rx1rvKV|:]JߛkX::ݦX+,ˬgoU1Q߬WGgo L6qH[;H{1^}!jԺ8^EYc^(0%j'7C#OR|>&τ9f&7uFi%J:zL([``<)H F9+ qgJ1n%E%^eB@|pa)^ Ns*PڥY=[=8ڄq'/3P㚲;uσܚK} ڞTFyfCoʞ(n:g3&yR\Þ0EnSR :1(gAY &Y$ֈjpG'F'K*pF (@=Cɓ[3HɄCю$5LYP#r3MXUڒ\W%i1$,/,gSZ#u'/YG6ǫ@0ѺȅS#c.X46:"H&Fˍ<\JAJۏ;E~aWl%P()"L`P\.nsEU j؟ v.*PBw.!&= :P2"I2\.8 TZ$6("rv4JA(Zdi{Ax FL@dG#1[wNB+I 3[=IsAyDc_ԥ," ),|z>8 x'"Ad0$d9w"uB孏&A<ʉUbrta?E' kp ؤn: G_}`]{qHM^&?hR>8zՕ#`~zcQh 2Ղ$Z;Y;"!!}b?i$P 'wIKExfthL 1K"^ ڇg`Rk(c$rqqN9Gmiu؍6="0FO\E%c =(dHHn2(7zŴוU|~.8^/0^65⛃nQ\pd)mA'mp"+dFYPSXq*-CjO> O7 =K,QN%*&LYE8sgL& q`a\k4RF!5PS8:9FR"Uݤ%' ׂ( C|~8" Jq{ԎĘ@IMp 9 ,Fo9cRSgC57| 9΢Z_qb`WI>= PT]xeahf.gT\cWJsiK#}kh#Y)X|W3૖:m`;y ܐopBF=q7\"fv٦~7̵%/ןZMtvv{fxjsjChohsek.!0Rj{L]jLmye+U|JOh5]+}rB>ѩBUgrF+d?__矯w⫯__Pf.^oU8 9j 7 ?=x4֥Cs#ZOM>3c[v6kk+@W?,j:jB.Qpkw=f:l yMb׷ (׫j ]-U<1~ت0 зV WFIigi~;IҒqQ6Sͧ[#w:K>a[v$钴T(Y7HENeNov~v`sݔy{E-o}6R^.RÕk'AT[gL-jگ4_ap-P tcN23[gC\L \8RAHB"$Bb hK^D1P7@F)PӫFEL1 1(B/ 슑G: J![v6PhZy8^+&ZE;:F<;lRZ|G0ZpQCT&W71*_Vy[y̕ooTS:Z\UIDwpVzN6{sP{ 0&E| QdVJ@'cLlR;eH9Z<ǮE(6Va|Gi/g8[=n{!}n $FH̞XC@)ZL2"FE"Z"` I&-4B!yrSR ;c"dzjXK(OZ Isir%3k.FN|خ|˯mH>\[;t{=պ IiU!}!Q$lLQSBX4Q8E3*ET%SFL狫8^3hH .`I˔ۃD2Dat&Fg4qh|<ƙ/ *h8}az*hLN0g-/c< *|[E$#LwDB }h#!]D(*mٍnaw ^*ijw*i ,>r5.9K; $fw1ޖ2$ 5jnӽtϣ^?NB!xQu=tv'JKFq'at6͝;ȱ&grcoDlƱ67 ~y4q5ZG dFS H__kքt}rlh=rѢj,g+wVӕuֹzp3{]K&HoZ]6>Z]]Q[]حM/Evp?Ho9A4,xLB+\Vyu'jG=EݛӭoK賂Rk~^ϫ=Eզ8-=OW4 jS4cWwD3QejSzYd*n>2ِEdZty']3l>ziuI QN3A] lav=դYi.)=v7O 6ݻ.ޙwʺUSs:=.Bi!GrG8Ո)MJEB2e#N5jlK<.E͑E/B6YO@wPQLA uH€$ʶ9;TDQ '6Je}TYDAYD9{,:#]:\2\K֟4~њ^W*'+']b|-HIQATY: -,8J03~M0Ŝ`Dd)B@q RQdeubp^Pcp[H2ĘDyYLF)@dKύiBZx[:'TQDJ 48Pl Oa2InΫכqfpI4ytOyX/P7<=Og\~rv9w.5{j0Ȯȫ2QnMB5w|/*?GmOf󪛖z:n 7fE<4b>>ͪ߶ڗ]BsJo.W3NrQ̨ik7ʐn )֒1ZStd!>)%IҞx(㞖gZܸ"m16 A_pIrβ4v&W4zZGvdMWUdЕ[lv:6d+ '%h(qzvA*M&T0eozٟj mZE:>zC[IztOngUSÚ+o4x}jrmvUsw}n[`(m7{< 'oyXfɁ筮]}G@֛ypc6Μ1֢֮-Ǻn\_s\k8u%:uJi@ϵҶ̛ͪѻ ܳHWТhe]2 NHq]y;S/!>f3!uq̈́ %`|'YK%ɋVс|rrB>y+'wݭ?M|Z6S 
-U)Hä!57:&et(! J]Ψ!!{W?y@]B2&KhȾ a1赟kM7edAu^y7싥h$v6V2w+Q02Dŕzh~}/`7Cr.f>_ k~x;УQdz ^NL0Ӝ*Ȃ4mMs)|W?틿2FoC˅Hoa&hK`o~vxYnwRNBn71|050\Q? vfhglj&]_§Duh8lܡ.RJ;!y@L0Z m{b3pƗ)RBrha8e=k ɠ ec)EJR-``Tid,6XW)$/X(Xx•CTi.v<2hNʎFy3GlAj#"ORTEgIxnR% AX44XFjɐ=α\#'.$LP c;e`"<;}'p#Fy)]L:*Wīģ!DcU&Z#5:FǨ h%FY8OB*0BFϐFGX"ED1֭,R1YxXl8WE 30^ " #"J>@ Phu&.DG%)M HG<RD.8!JGţ#iqdTz'p#YYLJ)抋W\<$JE 0s (`,@ޒ{x.qhB0D9pRt͌x|ޣ V' Y:9wM~n-ld͞]p*+,紊:,%[Зs|gL[˾Os,홪x&TKbl4!,#%CXn y"R `/G/'7( /L(E1A!;m|Nq. ޹ MP.6k45:+0d+4.YoܭW/)Bt}+J&r8W !q$Q '9\+?xŒɋ*H37Z{Kq5\c&N&.A*K'cp5\ ۉF@ZQVpQA0n0I1DUʹߊK!Hd,P|Û풟Lig4Lߨ-')=3\4K FE gyWhUWuYd%bk TlUU<} T*WqQ +6ǝe8kdelHUniS @]&Y}zs"]vKS*_*]>:k9FĴ!&j1pQ߼3&HЎ` {tY6GccWJ.sj5FxIaul*LdS-Bp3l\ ~w䷱7qǑǁqY_s4~];rkpY{ Vz"OK lVSi2{>.+u> x]ѱbZstu" ubrzv% m~yK`4cr<c4FsQ#bYo#QEz*z4:(]BT6>HNdN;T; l~%\5ӣ~#xu}D-dT-žWo죥;&oHr*RHxb*'979uUΥ,'/pɃ0իF(mu6n~ $LKat6Yϻ,U-hͱN8q.QB>,u@z窜U'2J@fVDB(Z̦.ɰ;&υM1&6ƿ#<yg |PxCYb ˁ[ԕjǨol>gIv *۞2eC5J]Wt􇕠%ӴK% 1.~+0#eoU*Muaˑ\ejMzGSk}Դg\?7\𵵮3?QiK8,*oo)o|͒D.#GO E'NPO$j>6T)A=>H|8^8N p?S;˫:ywYEZݺbްq7»O)D8r ݁?5i:fա~/j ?_%T7%֦# rхc%R ]̖a^;{oXwyIm?[yN迆֌.(~͐2s6ͨU3uf̮>#%]fB6:P~j 8ok;8'֭PcX1,d/bșlu @~t5%jȢPJ_Cٻ{ S2k9FoXqF:˘Jo㦗Gu͋:Csd_.Pk|2uuurfy~/|ۥ|\ĐZmں|prjݠ-{^P^tSĔɧWۍJ,*HlHY&a!2#]E@ uTRN|ƕm}Ng#r6Hs+tzb@eDixˢn(2_sk +,G|.g-]v³Ϛ_!"~Kis|g x#=׀_Ni÷_6Z;ڕ6J%761^oPI^n>w#^p^Ύv$?rGy OhS9?n' ^!,_N&oYTэ-*XۦLO|͗Ѩ^tLњVvetZ|[vvG ewu &Cqk̎b2<$7R:q6O`8;gZ?\ܷux`HXKDR鶣 nvf{'WR[=hf3PWwu {jSȭbyA4B( yk=Äٶ=Ң)8|c+ ş'&i)0>9-3SR`sPG3 9q:."ImrKZCqǿ/CTGKm?Bϛߖ?G8V+Ybh@ɩ[H.YPqNPK0046rDxa::rSIHn4, }BYƀq"AZ;ۈI JhPKHȵc jR(, RXfq4wX<rAmV{G|rn.U,$a@(,C;DT`D`#!I-nC Kdž@eF6&$EiF3 / ҆;R <$%R%g' DӼk=ZLB1B TsEr[p`,%d" A2#CPRk#2JpR*y%qRxgDǔCkB2X0#T'#=A=e2/84yIq(Q8+v.*Olt!ЬU*]B>qD4%&$n4IR"%Cez C8Fu}%*R (;hڢYG[5c"C'mCK%[p>QIyQ\ jꑆH"2e ZdWxJPs刘N9%Qu]$@QRBCb. 
CBDrq_eb${Q]b^ȖVmmhI:cg`̀{pϞ@e^ήa2[(!3rJcK-O݊ Yq9mAYU7RܤnDuԩ_56BMyD3I%µ1P4f, Y'SNcqHJ疒*C*Zfx@Ww@Aj]FآyXA1x7 털[%TJxjb5:2-Gd!kJGG>Liil<W*zUC ж6E#7CA5a4q2F[QS;62w18b'`h&CD`: c@kB %3.ET bXSY(i3XCfRB8[*Vj ue*3!(h}'` WpzҖ!xHf e]O+ ~ R'+~4tոIUDIFPD5iYXK` #/U:Y5$$W'S=RCG%㠄RrJc#d6DT;z WmFJoVMp8.dPgzL %_=b_"͘54U!*P$0bB &"2H ya]^\RaZOk*hʖQjy =MQ # bZ:pĥ8 6*}t,4+ :9Hmi>*\ |tLA-ENpKOj,JV<%EHi$Qd"+3FOR Lƍ}4mGDC%!Y}P*S#fH܆@;C8NMd!*T?y*ƝUK8&KuYN";>n n#[#O*q>޷i%0h6B@Fx4=w13~r9(P"Q E݅Z˅:@GْjD)(ʠvO 4KLʖ$|0jF z,jp޵ j|%ӣ B2Б%Y8-*7n,L$ fJԽȂ&J!B2@bEE5\-?a,*'aާ(P>&C_{YҢ cC9CQ:5ZOz؞fH'mY4g%MCըPH»o B.EУ"*^ "jBZ7i[! P *T_jʫ.0LЃ!Sj1`6<g+ǰӮe=\KI4Ը(D0u 4#U=FB`:°Sӿ4"'4^]fYҳC/[(cբFCk]־7r0ởTfe(9n(!/k CC*樇2tyD0BkܠvjA,c5*#SA :HO(H5:PAГ'JYHqXoڬGŰhlAT(l'lE^H"ץ1z:JH+4I'ynèPHIeQF`ҩj:usÌg=g U4+ m#᠅H2~ zPc@s*g6V*1rJ@Xf=P{QZϾTG Q$jd-'k[֪y~9(J>H}05҃7SA%Vz؝Z *m@…IX)kae J ̀|" =(鹐&hJnÄЏ8h5UPz*J’R=%­6?pQi0n3;SVUa2 sȎI8Y5I0 %@$@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@_0 dPD |>$ ِ@t?yV*$ЗHYz$@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@_. 
4|N$PχBxFrē'J1 %@>bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &b%FxN$M C\ k|$I/2¡fI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/#ܻWtz}qnA/p߹ֲޮr裹WSb@f,i]+,r,_nlXg9eaWaw2py^?5֫\;CCwˋENn 路W x.OWb}Zv1*to+HOLa Ǽ/.hدݍz_)3(3!|U(9*;D3Kd@K>ݵ`q'@'?dSRS'Pveo:Z6؏Ncг4Btjt CY:ZQH=kO6RrD;Wm8n?PIQgu7o޳No5jZ^Ng:ciCp_Թ;aTߡosj͊|/情o~vG=&$GhV;7݁wߍ>?<{4,=EA鿉j^FP.JTvUAw1~Kϵ܏ ']!:N3>m|@C?obؖR󐶝u 7b}lY !\zP ˫ţՄP 컻H?` |vRn4}?~ѫ : _nLB+ߊdM5}FcbJϔɭbrፙTO4|B]<9.S̴ǟ0my[X/6(73(l~t?O_ Afnl5bW3dXQ K#],j% _6u pD<"/ܘd8}vև z՜ӾXSD^)Rp!f/c& Slx4a`̜L8𛳻FoԥPX}!vӧ>nv8qSw#|_y_yѶڶ'6EAzgPZ(󥷶B[́gܜ=#ϯO"ȱ #u]_#45m7k}ӊ#$z ׻ !h4p}G}+ov^AZGZYV+TtUˣ~QXA?2XoJ1P1p 1tQESWvt8FN4w>֫n~a4h}CO$867dH)죥g[ZJuUSIJ^/>8G>8{r>?KƟ//r/ū氈^ή F -Fp\oZO~֚bERf^qYpİ%./.|KZitωgayt'~[ζo*f/zo# OZ|ʌŐ_\[mV?ЮAFkwZ o~%\pKh1?Wg 5԰Vœ j UW?,[y3zIboWb}vݻ˟7ݰQѢ?PfQfY|^\__ߴ,C!W ڃ]s9zE;rEoW\_ ]m%{WM}{cfn fb{`5t;w[-5a}ގ3\hSBXaFP |p8V83.L ![!4c gK8PnǷuo׽_GPSyq~<s òܑ7V >%kS[ o'n my1J6cXݼy{JC1v9So;|%UMٔ>=OzԄo~\KI4o ^ϫ'WPOQLZyc,pi]ޘMJz1[}-(9Dk0l\FFE2%; 3-빇o$o\O\*=!dY?O Yv1yQ$'h;mѼ^_#@ ȵ1x9K·g:8}P -/F3g蒷F-urq>e_lڑ)i2%ޓSX='6ꊠFKK0_{v> =H ݚ fNQŬIj\QIHlVLpc΀GA"Cdb5*Yyt{K4I,Fu4콢ԻX/qj<T_&+NaG@7=9p*# '_R]]6Txb6[C5FМe,Mbs-CgD5ʬZ_Z$j#$J|A T,j( 5( b'3vggUnq/ξG_xt֒ۻl)|b 4-z_n Cbz=*AQ[y廏m~lW[GvlVO̓t^~rm 0Ѯ1<َHuWX8ҒO&it۶er]_\$YXlU1)0BN렕'sPC~'^`|A/ѥ…1RbQފ Ѕz#m6ֺl%CBEJu T ?QM9^4zK;o|Nv+h9Ip>}Ҷ?]H[3A97=Iը@Ck4ҠUv#4bG";H(dZjc2s!PI8VuHJ@s)G<)@8|e}{Cc}q/ob6ױ6EI1_ bZr][{(g?M~gl1`upo[z^}D-OlYxj:Kcýl@yT,FQq_SB^yJ Ih]wރJPYS7k(vx`WoGo"$-fAv>Z1@M9W 4h$vj\c0 λY|f[z x$R#l3@ X /ZU lmcfJ͟5i.!3zXq"k[[lLJRUl~z!蝐gS喂6)>]W6;kF 21$ c(!(M-j&/Yhgg9dL֨2+.UѬjR-3)ŚR+S&Qh <; AYLTGR)2b8O]xf6SϻUJsh*%Q1[ 5"Tj6 t,ВM |{/pOӯ4ZTvE\1ĢMYFQIp0&# juZ_qnb!cLzY6' b#T9.MYJPt^{23:.cZnnNN78N;#'p &F/lRQcT[gd@q)28xV=_9eDDIǀ<_79-0Iقm_TU\"u Bv\9psb.<=Snru>bn&/xe?څavʰ8a7dY0R C{ɲU;h[) \@U=Dy@l"` PwRr,ִTXC + fr.XGm-9Sq;{SjNv!ٖDzN\z'γ%%hȤXiQdv9,)Q,y4FFǀvzL+. n<-d䗍' *p) L,D15 ),OBLZYujc#Bؑ_n$"b5~TK<ӤV(W=|EF匑Ȗ~W8FŬOnpm'L TmڬH%&ڢ7BP*@ e. 
3Q'!8{B -LBmI]_sm Jɂ4 td iz,Bm93RnjϏ11JJɈ;t BF[?V yaZm.9$Ͽ\yJ_,?rχtB!|xm$ ȔUMb Gmӭ:p5|2LGmRguV$dʃ0/GV,vwZIe G h&|_{֪6iOߴ4Zu :;R-0ŧWۺ~?}|vupvdxj2¯q O{ޮ9H?ڐoNx#B!6dƑP<ҧMaa։=3bQu&Gˍ^94ݲ8f}"7i,șִQZHX3urϧ,ʠz_C9Jcoyae OO޽y~G߽?監߾f'ށZ|ux~zb追CۮC+[ mD^l6.a/ǛKJo aZ֖0v8ME-ZslL$ /lMGZ_ @]j)"1 l|Wo]i#qtMN-Np;)cD2lY9 &t(TƑY94)NK'+Þz ,0*tYԺ #RɟP){Nu*3JB_wya:]vnU}ӄht9R1o0MW_utDĄtT:jbl 4Dn<Mr&,_Gok6BC$1圔I` OBYj{V5l*'(m>.}ns~uK;BFtU 1 vjyܬyVwY(0 ǜuQC$ȶrRA46Βw@z8H5ݞא6{WCQ i5+_A 9=$Y\d|#1WPdQ\m Ѯ^zrqض"ZDk6몲we@Wy_'"ty1ƀ`ӈ4#f쨽O!~%i"bQ&(^DKFVfFR`t.gC9)(eH1w6xY T0s5LY/6ڇm%L@?cbY]U5][z۵KniCLp4^e`9N~]u j`?|ZYRNƴfͯQcOM7Y|Fj)KE?>v)Ap x: =X!y)mԠ&T.e08<=n *ߗkUV--??jmWs0P\<{{d,\˧yox:Q/yUkBC?{@E(b֠i|ȥ)>dOOv򐷹}!rq>Ml:`لX|ٌ:jq-z>͝NF\x1Q3?Xj {˜Az:Bpn+ՅnlHbi0 D= ' 1!DK)-Sl`޽֥@̻ 5&u1Rj^+{'hs1AuB M%S-Jys-IgYCZ9wi_l~?^}{03۫p:lg8}Z5>ō*ŵ& 'm-lgDwb}{7s$Sgtz~Q{/c{g+^d*zPYKX>ǟxBiv0Xpx P%،d͚e/+c 9`ͳ>t(̃{"|l4[p#@e:Xh~PخmlpXmh 6r}&38 gA /?yuZM GP6TxZH)Τɮϑ?==ue\~AZZQHk*MAXt^ M)X"61DhD$#SNDw=P,/h"6o@yT!gL# IQ )%(CR>PEh4A DQִT F5uXgӹ;Uڣ1cKu͔)tv{WMuI3>yv9g;Jw"lJ>O}3}6 &F9dXymdsRơ'd2]˒]ΦeJMlJ"VdV.mDvnܷy7S=ū#sgl%Uړ2E)YE (X(萓\Il R H&o؋F?%(R!)>DID#ǰejf8]UGW)`9'kn*ʉ"*z( uciwC߽灤c)e0R 1ce`)&<7̬vw 6txJoj OC>wBB9*ibFV烒V<G8M78B$wy > 3|_4w?㶫a~4-8~EX<చAj3.}`[2S θdb-%qߏxwH,x,MX\Mnс rG^|ޗUɴף>\?>a-2?Vg$Y}9mvFxq09=;X>T E 1%?ޏ]ECєewh#tq5zaufWWˋ{~mTMϼiN:^g+y bN {Gӹjn$O0`1}֍dڑ@t0b0n0ˋ-1bQ4e&=wڛp"kGQnuJ"_qI9-W'j8͚%MAqSMk_ӛUr{|7?o_wo޿{.7H L0KHPI/ ?36m M͇6Z6z6ᛌ+|5fYfƠŅnmH4?|5QlҚd *z4 Ml~:mWqvnR%|45"57qbƸF&>N߸ q%Xdi yR~e ^@wIZ-JNzl\ |EÎ8K <53Bh%nw J9A2!a.XGkZd.ͺ'gZښصԚ-nyIh6!,䁳Jl#|8ihTc,'Dk⑦30 Zd$xˠ5rZe9Hi)Go $Й+ ύX2WKE},7)JVAzgXBUjCGU.n<7?]5.)ˊd2WոxVy{+;}w^{Ϯ+U{cc57wφ^\ޔR5x*<מ3ƳήOi5H>(0{@U_܄\p=o,d{Vqvz8dԣO'TŝťxROruJor`ɐ%ZH~pDq0G#z|ޯޯ|7bq rqݗ_9 %<w%zX5`҂z5#ޱRW'1'UIjeprvn(N}P‘(1+w[M8IMd^eK/]aMٕro-MRۯР]5Yj||>iVMYR|:PV+r+>{ _^ -RG>MO!MH G(')osf|pѰ]9YOa5AGgE-;BuRvc |}La,} I<fK|` 6bGziR Y\JZB@puZX@WjYkgb) R.*CsJ@ 2Q= VA.ȼ26DnK4+m,s`F712M>;*̩"pI|UM%c9Pjb*NX$NaYD K%-T x x7 @[0RH|@"LKI=4)4nưJi"ԓY?N$5GfUu)XdvFA^N6nm맒}Xfageor\6[/?n$SOb6Dn3-O 횋N%dsȸY4 
S$4I1[ ܔ}hfq }nZ@K* |%)tO@XEՄ+&mi&sݛz~ݫ Nb},[QnrVF]5_}"][.kΦ/j ZF * AYW>RV/F -@) $2mk@/ 5g't*uP%(U٦IR^6>lRN-SΠqNI-$g,Gwl,-NѮD@:R%T*p#2 `RdGr5rUL w(bnfb5\ox/ËA2y?֭OX|R*0Y;`pf%0$Z< ?*&3ZÌR"2 |Ǽ0 =ޝăLsV/Kqw %s^;f}Do#>-&nzhΫ*GD`訌`5r5DꠂPamg<څz[}QQb*&֖$we #/!t΄LZI笑9u~/#]q<gᇏ2fFyؚ.,t ]iL?OYr>SC>r( y5AP85<IkQŌk7Rɋ`w9=Jzm6#R̷]w-r8h8u)s_-}yKkbƦh ;β|"1fTe#iǃD_'NZ0!Lkj/5sBN΄|r.v̈́jȄ܊MH虖F YD2PI{B2L*uKa'sT\**u xW|S"'GZR%B*2մO I2mL.䜭φ~FiץwiOpշ+a`LK4L b:4x=gIp~jjmdFU[&*hH"*۵RN)խh&J$B@8Jʬ.();$^zk- &1#V-J.ǹ A8#9пضK5rGEP [Ag=V!˫_P \]g92Y:aZ3^Gǽd뤨C . Z|8>-U`qpg%-Ytaԥ8KV6rde*D*y4B h- a"C)8&u"+ fH:\I;2FΆ|smeyk;*ޙ&&[*&!|Nűh] -<@ѐAD*ӱ@,p\LPo xk!eW"ܥG.RN0aP2ZV:kcΦZA8gmq G,eD#mN2J3yY%@ VNu'lmMLB1,|k:&U)m3&H&:nk#B0_kKd[-3߼T&C`ɲ@` 2egǬ+ 5CS35B9 (;h=`ómõ[tb_zڭ]R7ԳKCHyBBlD&\~bP/̼Ь {} d4AXĿ,D eKQ5,ӄugM˚;5mw"k\RXr19yZyF8ȸ͈tZb@n3x% NzY @J_וgyGG'm9Lqۗ^PJ1i[J$4j|jg~v9fG[Gv K!E)AS=S)b.k#}0*%fh.ac6Gri=MݝyF!O}a>=^$%Y*VH0,p'CV5S#&l7w>u%Pm÷$D& `uqkmȲOnڮǭc3`_F=%iRCRvAj6)RbU( Ȣb[έ@MR\-`ZqŝQz,uhё G8N͛o~"{/rwc2v`ݴgu{vG LB…m )($#)N0BK聧 'CO'Q&b0L|("7ǃ=9'#H(T!&{=UwOu8Wף/O+pMÁYw| z!ŎhXxI4q5Xx |]uv;MH+$ ;A]zWW6+w+HXq6\_mKdG(osD_mkmổ_sZ.\]3oU,_ βw"VVe,M:cɃi /?,P;Wov d)f;޶bYnx#lebIQS;僝˥ugKUnmpܦ@gq#ZSf@4-w[P5ӛ)NHn8oS_8=LC3EtydZ3nuDYW{o;\os;](YEANxH1@m hottm~1#t&ztԡrY^SvbhB%wA›?nHż4(ʵ>,ʘDD LY%}Χ-Q()==1y:*pigiR"hRS ,26Oː3/*E =>]'ẘnFt=iD\џqݘ[]72DD﯇el|?çva!"j5$Fk뒩%'=d9f: 6O> P~\ uLh-n{-Ҫ(m˃li^6&\/\R D =F\RA0LJq;&K;oL/#!j?|L&ACI& 1 )![OyIy EQ޿ayO#yӐ].;r nnPߊsV^ѣ1vqh6,6Nne] p_|w5c@;Uc e Y{v2;tg,!6ϑl6kv{̿8FŊмEWn][/s4En ~oytU&=ڥȻ~mz݉ :r]nV/n|_g}u5?>LD;m@˝k$LoԎ|ގ*QxQWSX$(vD$qCTPU_ /Z <S_w7ލvz׌< x ᭦7# |WM vcpG0jiVC3[[ifUsfPy=/)֤\|Z͇3 ?Wc5_'M1-u]Yp^TAlLeΗ’S疻׏?:)֑MxS'&S{/iy2_˚֢M:{n#QˮIS'/ cnKln=f%in"lrZ?ߨ\l4:بwpV<:{l;Dv}2晃8@x|ybُ?7nߝ*b~3o?z$x0E<6OcZJ}G@1(ä~`/{RsgZ:1ܑ^ֿ[JzR$5K]1REwL 5g|!r3%W=P\Ͳ4-3r#qY(6(aQeʹ5XjgIābEtV7]Ihl21.:iDL l!:+\T2;:޲l'.e 빱1$M*$ <#:j>10%>R v4ۄ4'䞨TE۶ꃷyqӑ@HԎ2Aekp@WuO*z(Kyh$nsZ Q# Z%HH!I, Dh  -':Pw\Z☕o?fEG7lHŞF3Î`Üa3?>}` EkIV# H+_)rrtXAg{J[)Th47̂AG/S|)=8ۍGa<.*ڪGnuL<*bJJ`]К3hMYvgAG"[' xLh-ŜV"Y1 
޲/5qZ(SCr^9Hij)Go TΤ s06[Zۃ} җr ]?UIê)CUS'X&?zTsotZog~4>Yմx~eVq?4ē#k*Svǽku?${{~X8q} aQ^{yڡ?n޹R5}[>I_m#RG|y+곗 _p#湎W(Y,v)!AH槪RjҧO~fH6ƙ8Wckk{$E,isTW>uڟr\]~+zqتa.aSZwV"zP[ )l5vup90R~"/%V`'B2xT&C%TO8 tFcÁ E9Fpd =es̝̊6 Nv+9 ~VX\݅=GE/ψ~!|%*OBqr:I*[Rf309Fr7b b=k&KΊZꤤ?m.ڴ$^4[♇KY>BdKRZ"s2dpV 8npv).@%C M@/B= r ˉ4 .j,(+t|WXDoRdk ]|Vuk|֕qd] ȚqdmJ;HHqm܃AK/el¬@ͮ-ӶRZs& }BRZR "s/R\}|BT:@˔3heSe %! Qd]"N%ߵ[Rǔ`f$(>!8#2 `RdǶ˟&*U=l3\2ngd<gK:kuF8]Z$fÙ"D\ngdgAdFkQ #V@wۊ.SY_<<>g_qg ZCK(ـo;~8` ߐFL\2Erh}*GX@tTF(ejkAWgJ~W ڎ<wٷG5,FbhmN2xQ"y Is&LeJ:g̩#{%$/}a2`;> t"f9__:+Ec;{$'eYW_9<ǣ' m9Rq5k=m0Cl^%^э7RSٲ|OYKyaOɄy-POsq,9.:@T%in BHiP0KnCBOx8 Bg.ne̵*lM;ƤNd|\p!'B稸T" "'Gd-w!Q5da2]-qn Cs1>rΆ'K`< <8q<t[Ȃ5U׿:]%c/2:#U#h?腔>O-R ̨j+4VEuoH')KI>v-*sJ^͵_brkiL[ˤE qxF&LNfAzV$q-qw!@';mMsaOO;=O.iJ&? .hde,G}Ls1Z3ϲkGDnMm[1I"I$lYZz'f@4P^s&&[2&!|Nűh] -:@t̉US6U쩌~]rvH9@vBhs&HItƜMZZq6إ:m6i G,eD#mNt!eD%f, 7`J3)NOvNskcb4e.?LB1,xk::UIm3&H&:mk#BN.9EOwEZ"U>۲-Sa,Y̢֒AfT,u%wfhyƲF\H6x q|r+v=G'LJM7Kzq7uw$J2 6y`kSwUk-pM£uTyw8$A]Jk=`}Sgb%R݈(Fi 1d cArRٲl4!W[YӮ.8BG[^ĥ4|ŠU`tJ KX%-E5Z-JƠvZ޶q{ijq"NQP%sch,^g~+aSo$4FΚܐ?Ѫs8z mkɘ-p[3b{3n{3)bѸQԃL> Η=h?nx8R vsˇg%/Lδlť ǓXn}:% =^JٱrSP7TӘ*O>p{ۻ7M៿{ͻC.|}+ufKm"("uWp_^MMK whZ6zwiW[ڽ>N_jnmHʃOFPrY(-|1e?JUZ~&56lt-~*bԘK~@`2曲 u%LR9ҭ$xyRĕ`Q%2$9&K0'!Q:"G%iU\;+;K{;NVN}uxA %J s:zl^'YʺN6&H}Vw0+]v6.tQBwµp|6`ֲ2R k.b=ғ6f!Hf(S \:(5,"R{]Jxhw<޹vڝHxl.grוּΉs"7ADmF[iF.g̿/+_d嘆N*jwpR?|`Kͬ 4-DxBhQR제s>+6JHh7S󭑻B] =e^/M;OsIڻThnNez OYz7qG⮂~;:-<5V<'ad.h$hLHAQ%MNq@@8ZItxe8Vs<_!hm +` äYbD0DJXň^Ĝiwꏐ}XE˚t2&Խ@Qז]m .x^Mdܓ|+WqZkÈTyuBC 58MH2D9l$) jY@V`ul`ɳ͟cq%vc9k?}I>fz/=46^φ㋜3X/ׯ_xaU,VRKR Ij/Ji:kx[֞^ן}\qx)[Hxwu II(JK\T&z3Ƞ_mj7~q-OQW8/d9;+Z'㌛CU x?}_ ™ղ/5Y̘#>q,]Yz}E`q˺UĂ\P(MuܛVk2[q_&H ҙ`KM :BI$s>Fem2Dz:R8d (ۥ5ZTjM)wh/մbrWqn)2r.?\d=zwBvyzwZjш]jJzm@X}nΧ+hnR7>{yUm1.w_-iE1o_t|R%j^CR xSCyPIfmޢL4_Z^'vg,sя-NS8LPcMn&Bo5u -Ɂe1/,Bs\ Q\Z;LԹ\p-MѸw֦EBltPSjƂ–ʰLm65MtRO=>վyaUŽNo e6®wy p4\ABߘDD LY%}NaY BE*QD1_߀9nDq*%-5I "sIh=mPi2NRtpRz(<}z_abgsc/&ۋC\xi38MqKpx$ vvJK`3q>_MFD6VQ J+!RjhɌVw 5=].j]qy囸X.F4u$7K a(ǤOT[H&]UR0J!:4C&;ԯvל=f[(tbBs xD F N8\&&:O 
")iv@lN<}F/'L߂oנHF躚u`ZZ6fo]Rgū\G8Ļ&Uy$匠$q8(͇燵F-z ?oN}V>N߂aa1P% xOhOw%-% .:Pl`S䛔*2?*`1gb 4+PUUe=.'?3" =-Fcj<*_Px5U}yP9 g_Q#`/&=׽|giUϟ|$E ҺdJAIS\8!ÆsasἧJ,wI䜻!zzd|MP"B.NBiklx.b{ZgmXW(vycOSV?.K@TmDd?6N+]dq/ά#O>/<0% u3Yaa8"''"$Xfh2 N;uDZS 1PKwQO$с F8RX]فiE!S ?o]tP~AJ ĕ3 >OSg;.{Bn'#D$\Иo?e4 6RX*rC]G~9a`X{dkTJ$⚀k,BpG32mS&j:E*Feu 18o3ߝʦ)P5"cdؙ80$qjQbb$ܸ-:YkE2KH6hC3mX_)A~8y[2ngvQ7T>|2 *@Ɨoysp${5z?FS7D|II%(J,sU$ U>~^/ )ַE63n8o[jokqklaV8&s!crM"PU8Nk*yʐː΢Dt!x&Fo&E%4ݑquPn_DLeY&>~qOa8 tYUc;]lusex8lOo0ߌ6d2ɱud[PP<:J\]uSł^X]57tEȍtYrdY=6¨֧2Hs.an)y'suk.V{|[ux8P|۳msI<ʷT+ަD`MmjΚ?+]vI ȜMvE?H6ŕmxm0o >ҽNKX |w>~%yag܏Oc2/2UrSr מ.Cr_&n[h8HKU60)yH9&r8WZǐ8a \8ɁKV'/t5Kǽ%|!6pBt:Qx6蠵^ L.j FO ;Ѹ1VQ{\pTJzRTT\".$9-C,q /Vsv}YDDm9I3J#nbI !>i\yw^( uDwFX(: FQYVlu{9 P-%+;yv8t#maKi}5:$X#hSC9}ǖ3Zg(ޡKEŧ8jVgEkJ oW`iЂ8l]X?vSaSρ|~-zܶp,lKw7z)!M 56@\v<+5GE\c%I80'$j_In H#ec4E'gDqSYpuZS2΅HeIψO9؎Gg;Z pGؙi&x>}= y5 ̤Td]xj,)]%8G(PFicKQ` -[ϢOy&F!ѠV:<)7) .41@d? 2 |܉ lud >N &L%iCzl"'ϟ8؂sQltq^qFN(rtxP^t: MRX%fhn^K) ""Jb׭#d[Tfazʞ~|H` '28 2Ghi@XM%[!> 04HQ$Sy#8\$qFK͉"w}CWl_r#,Ey ͓K3ny;0dOS=Gh_K)*7dIu^hC[=WX nsdV)&C┡^{-TH!Aٻ6%  (Av>/?C/>yZ",ᡋheD kz~ut(͵gR%c5rJ5YXmeWʲPAeҳ֓4?e1i\Z? Z?χKl`$O6`et*mSB@{a f5Q7aeb$ AHQĦ&j.L#B8s$ꢱCȹ[b(]21wEjW[ڦ6 Y)3HT5:2w6ܤz͢LW` r +:$k 5G,m#'"Ubo0Wȹ[vF}=$,qW$bq_h+KDK^"޺$Vh4ʐ<03VdV00LXU":36F @eH!Z@Mh$ɒdGz! B'%b5rF鲋.:EWY.^.rY\䀠u>FB$(!IbB"R)" R3H\|+긧IUtb.uKQQb*&֖$wep Od,8ƵY#Ƿa븬rs=.E ouҕ<\~믽rr~WJ,r٘-4| icEj2׎/zۉ}Wyt4st )e[O }@{LYCzYKyaOɄy-POs2Xr\2mu"Ja?,Vu>XCG&F/q5i݂=lHPڞ3͎֟,t?5}b_0+ (Y$Xɼ%k7"QZB d˜ʂ0WN*[bnFb=uLHe \$8[Or@*ԉWRC2X]X9[]Ir}rG9J;/QiHgA dU2ZXQ"vZ UL_Ztói\vMm/7]dK1J dc9xց2ˆ##p}w c[P%G"e3l2wYO-iQ{2A*@;]*ff =^xZ @ e,QWRM[d [ '>{H91L\tSTNQCi@\=#Ufh'- qQ+H'TRČPJ0'_(tqX?F< {ܕb·W| d/VzgJv $aoGi,}>,]ԣ[.|10&-Bdtrt-:NBœԉK~@`S>Iĉ6ƭ6U4Qvɦť.W(Qfc֗h!.co AxMٰxϳ\P _BOo48./9嗡7 ?- æ ; kGL=e Nr!c N)(N}P‘(1+w[O8IpɲN^\~-n kw鷖& y~Р?f:8"꿚٬[5e77\d72Xq@]sT`/Y>*UZr^h [܄xb+\v`񃕄mct,w,`Ict2bY +:)unLo)lr!lgN`)&}IB*`A9ib]iBo8z)7. 
R)@hTflQF4Uaޟ~ Tx>[j\-72V!(6}`ZY&=[ƇehMFwźUǢ );4 yPZr͎U9vß6^@>Ȼc6BP˦VxTTrAOR!^x jde7ݯRffgo3f2<DWN`٢ /s3G8Gר'gQJ-&ﺳ+wH_?@llWz׀QMDS0$( = 3\ņrV#4#qdJ-h܀DWXc09Wp=<]>C9a }.x ۏ"{Yْg% ""T嗰f.!PN^tAk/>P,ԯ+;XIO1i9#qrQ\+ߛD[5Et^;QqWk}Iq|hql'\ 1]&Z)rv AJvh4]?$uu$Dtex#v~6d\,IKR!Ԡ!PGS >Mj.|6o]@1H-U㵄0?4zٿ6 1}" ,%q˜QD8,ɓ6bBH%&:/S!Kdw}-\.)q&pPkJ&:rđC^H+um";Dp\c/?o YOMt-K|pJQOqAfrFj.3hջ;0 sk.pxDao^khㅏB F!VH~ aiVr[Gl[;q(0qD0;um tizzb6lGehg~8sƙ3Ϊsv6IDBJԤ1j˕reaֹ\I1^<߯&`\aژ6j`d:a­N׬Z:D N1TWCf*Nr"d6ͩrqp.y&p1gT2JSEL."bk{+ VA8#+za&ş'GHsww<9ɡFj%0LAp\IR~ ]ds<7Qppf,*gyfU!~cːE*A%4rM-d< Ȑ$NHsU;DumSaxOW ,*𞋻Xߞ|f=IMՂor$/w:5Ő>EV/??%o>?n:* oƚwJ!"g8VQqo=UsJ}A_.QMf|1fތs 8CfR;%fR[ߵvs7D` pɋӫ$3:aטOMa-wk`hkO{?SY@џK~~rO~9o_}5&Z}}SgSs9W_=xQ&OқI܏-,=T(>у6nʞ摚܂a&<$~>sRlP־yr^wdf ð j)gq-ld;Xny,R^;-><&̴ǃ-:>W+Jʢ|68hglRӡo )5Xnɪ\צcES1ƨv1*U`ћRMJvM#HW%b[bBɊpf䂎+h-LAΖ U3K^xkmW>?^o_ N=<VH*4*?=}U qZyhpVZ R .&Fn<R+zr_ͬ] "f;EA+ 7%Kޔ@bOmHȷ/{_ MawJf4͜p_\i-[g/S]qկx\V9>wb3^Vpl cuRL<6ib9X&}6?h|BLiH8+ޟ~Q™Rv\9,|s7^ ɱ[f% ijYˏm`<]DU&:FmADZ㑅!]%+rsct.jqZaL lqMꝶ?Fp2۪Z5ֈ%6HP#ru0`g6Inu0Mr rqj$kaJKHsqml(]s-6Dx/C[$ P{gr> ? /[,c!*? mP)=a $C|! L0BέqF]~j;Pt ~IMdUV 7!:@v ($X(&s a[h| aVJC@7X欓3eV k5i-!8ƥ&:7rJ[X`Xc46Sڙf82AB#Taf%4SN9IH2¿ݴLq:O+V9piҊjK)@81Tc xc&0VR ,XN)jhQOwQ%9pJsJS)GaXptL)D+3o/ME4l,tĹMCVc&Ҁ2MD%_ꨄPJj3oНO J,C 1d*Ʀ$\:`Psnc,KsVZ4OSBQeOfrsuu=:L|H3iai&~Ke$q>](C(Ǘ2+ЎfXcSsҒ9(8Dy31nk!" 
ɼC3jOޤ <Bֻ48,ڻsHnrٹǏ-΂lh y@$C=!vL~O;Nl: ~D>: AT/9x=:@JbD|{"(&Dek_#2WYGnӒujV&Zw/o9NɻLnI챣onED[UXk񇞠;:e ѿeqfwhDI+ >S;קc\V%9cYv5$}%oD僯&KɾCcI^fy=RO 6ɿ54pLjy8c7g88Dgbe$7&1@x~'ƌ7%%䉮uHrc|$Y@ שc M t|w .gc 4L/] wvߙ;n-^`gӛ5`"asii$NLrgrb3pF)r`&4 4^a1V0&6j[;X/zf4n|g70CHw1I*1ZnZFT@Iv= mͿw!zP1frݮ.TPOfԼbFAfFڌ2#՗Uu\4JV] Ŕ}IDa:V )Hue2HvIm/DsDs3io(_@H4.5" #UXaSlU,^W=8Wxd;-wW/uAZer] 5nбBKbW/lubQZ5hZظ bCA:ƪ2†/ۤ8_N) R"m0 "~&]I\r# x _Jk aJ,ϜтT2^d*Ř60X1^몮k1׵Y?^_}2<0$Ix yG%0&Q*+劺UFۚVn矓l: SscV`7Zp7kێѪ#TelH6V& 'h6?:[82 \z {wmW>?^o_ N=<rZ]8Fn\np;)%h%QT%I:B(*d[cݔ:5E=xMOcX1m:chV;oGo+V(+/^KX鯇Au5z/|r($, 7(bj+ލQxXNlmv<[ޭ M4Ʀ c݅n» tbh1|0-hwBp)~1_swI+btg1l#|qmDdnCw/G)Gh rկT?zM!a;qh($'h(Uc@񝐆h7|c,`iW*D狐{IC*yGEaǪ\A.dsJ(݉s'!)p$5=f0SPkRFZ9\Լ mi9(c4>q4».i61'v6ˮU@3QdL~:ȍ:*%C{kgІYd=< Z@ :Z<0.ZA-&h  |X8Hjg'4Р}7Oa\k5z~fj}7~RʇCtM`D˧zly{~<R1݂ .iknabF|z{@T!Ƕ7\@|Rq՞˻Bc[GpNc xf⪹۟mn 1pّC>#۟vd !3 bʑs z=rǑ2] ʪd/vXaZ;up` KпoVj`fn dyU^9)q݇>*7)h8KtRww- FW-OZ;)qPB;)y4CœϬa`ht$ "ZSUG//W QlYy~1.*~puwlb:;#LV yDrL SX(rZ= )ֺ"i~6b$Ϲ1WkB{m8"h< .(hтx ՅU&.e~gz +Mc^ek,UxmѲ AO6EQ ~z`bW0b%/(?mTJO5MRʨKfkA}kğP$ (P=ߎ}0v2,I%-j+Xe􊵎dp?P2C&ev^F,o@Jda"3F!do3nPM 2v`e*$AfEA#cYeGp,L@vȊDOz@bFӤg#.t.h$R8|v ׏>wjrpbdn_B # o&=2r_B c oځj/5H1*Ŷ@ Y 00!( DI$`S(@uO|wKKV kZqGx~53RU~~,roRs_o&x/[ݚTtOƼi?~`_\}>=i{BaKZŒjujS )Cc稿+698jg9=!|iƗ'Fa%!m?KZvc%x>Kp(K ,iM.ػVK=q#lvHl!Ͻ2[hhe'A3Z!WƌVڣySH}^#ؿu^ר;ŸQYjHdT;2CLhN 6W \cJQ[$g2"`<큱f*df߹|>UbWvG+ 8~oku~yoZ;/rt5,{-P;xtX>Uq`HِS4jK*_  ?=KWF žͪHOݼfoyoG'<9sw?5o߾꺤c1ris/V<:=8k?_[s>wߞ|8\5#bs/-tV 'q?uz>m4-ۋ &*j^K[1 EMs\Nom[ET,spCa<IQ& )K& ]I^e*0&O^*N˳c7U#wP4*2w2EcFAyjuMt5i]ceJN W,'A;_WĦ ^0Խ1gUr ʸC>%(kPڢKKyPkŭ+I5˒Q2Ȇgf1)16jHQ(`TQ 8kaJG#l"d/ A 8WTL M>|HC](d9Et, !X{%cdSY*+  cpxyE^̤N2G*$YXJZyE0AcNۨ.<|ϻ`8J ̣gw̛ЛUcA[_|̫7yN\x|LO.ى.nȔ8BB 2J;pIaTi{:vYrTQqO>c,i)(D#F;9ãb3 {T2'Ʀۯ祗 ?nGh tDڟFFzW/ m8+gl M ED( XXs@"尔m)䃌?($jFM۳?!$0/hw1NBc);Ttg BޢnUT_B ÑLUg'zpn\`Ilh4q0R)=u&*\!((4b£={:ʡJJʖ7'W r3zS\\y~4 `'0[;œ3}waіgvpBډQ[sl/EFNa7T2r5@+S5|F<>!қtkVAW2rGI_0BS]lw<i ఼0 Nt'`W:)gotYF/.Ak# 7,lD^̪4WHKѠ|3nٵMg8j^R wm2/-N./d5㡁팃dk!o/o@gT 
jp?XJn_-Z^WU-[7n]thVB3_uFY2U&Q A #1Q$_ESB'l͕T 9Hi-yR#O_.Ԧ OgS5pHBIǓ+cbrw &MtkY.w1N`8p˹^CỰh3^>mĜ7^CjQUN=Q*Mau i|_vsI|ZO8~erOuo6#76(0ڨVe~ /.3{5ŻJz߷$45InwݥD쏞F*Ŭ1<݊q.:qƜTƧj7>qyt6\"yD)E'tIhŠJdXrfdžg/gLq̆>ca8X].< gVgGiH(naX`[!GS"]*ѳBVZ&E2{g€3QE 2! e( Nl5B.H} Da1*IMvF%A"cB$qH`6Boq;cw4>^#*4Nu{i?& :A/9KL)H/NKBGTVHhS,Zžv$Qm#^XmH Ҳ#֌TDiHȡTb F(aڑ4SEJX!|*k_Bdmɒ˘j G& P4JƌdXi5~=.{L 1rSmƉid*l-h$Gk:YN[qk {"dDŏO5i4I/m5G1a"%UALALߩ:GwTu@M˹`H-dZXǡuQ4,E7ݑvÚ`Js/ uT[H.TfvtZ_b9.~˹"띄/A̞Тhv wb֊ &5pכm/|짢jrY喣~JG_ Inkgg½S7xIqA̜cYH)!d:xKtv-ϸ9{{Sx 7 R9z7#Pgoى7PW@ )F.%Pd3z #C|B`ǫ,6+pkg$ڽ`j3 9Qr> MHrÉ1jt85ff=YAǤ$T(PrE#$S4TFiԊElZ.%\N"e-șR(2ŐShX-[l@/wmk 86(αDaY$-lc TzC)"e:p=7 nZYؘ=\,EKknҮ/gѠudEd4ݴuH|WOe͵ݍ<=xuS}Q< `7׉Q }+=9+mhtu`YA.;Ar/&0a ,^gmֱژM \p!KP @u dVى}4,P;F#0蓚ӮdOYK.¡lH`XٮI4WIE=u"-P idX?ֵ a-QH-iJ[_ )BR~EMkZ]pqBkq!cYtg ZU 4ia yn#@q 5i7 IAj^@d X3JO V ͣU^ yZ>ϓ'gsy~ 7jvY+*m}fO dsC⏎/yvv:\}@5ǻ+cwpY:dzU?*5ދ I\x*$ AW3X7|^v f* ' .0:rs>Wp#"_(},ʎVUd{?oz7<š&W/f81A~XS~kmq4ڦ=axmp;'yoۈҘc#ҵW=f7靱 |k`N&4%R,}Dz⽜l>Ŵϛ7_ֽ O_/˰GNzӹq{9BYn4\`]KwU{dw1&jZͰ2t)䁡#x }3[OٖSyt'YCZasdw_R(rs'?QvHZ4EI|K؁D(BCr @qu0C%.RԱvv]U`-ӘD5Š gw.&x:yok 5Q$+u(]F>Dq.qEV䂊wsMÁO[Q~D9 "AC7 s8YgւF{#9568 (_@M۟לЗgnz%:T:rWG$h[o?CӼ*NQf2 ~fjUg{.:&jGL4UwL\%V|jx>gQayВvi o-ju"&qr1@]fy6Y]CQkuz5FTŖeFj|(mrފf[qx4[15қAhV,j} v-YW&Pu55'Ya.<"m f[hF\+Zi-eh-(7º -Ջ4Fyթ _S$ Hк  uʦLQ\$xݎ}u` b.=νkcj?KߑF=86%ռ^,n>>f+0M0>8nV  hOdQAe J<.&ըhN1פCE%mH T혒$n﹓j]\N#O6E- ss't/>|%{7j=Ե>!k}]^,l\gNÖZԾjOOji}juj znv(X׼}OT:3rIZQS>"J6n۵ꛐNn{Lp*&FH/cB*qϽسpvǴ͘s!eCsAx(#UB#!g1'n o \wa,}.sZRtR]UYՙ)0'EѲqy%L7AA[070Lr%>@aϗ 0dͿqhnPhv,1Y1SlsR쏪*7s:|Jw?~7/@WwuWfbΉ ##{%XyXNxz!젫]19 ?.'=YE*BOYE(}H" +E <}#/}^H0ZpB{wiGqJX6nvvtznoBvrc}[ʝל)*v=%Z,%nC,&x:yom +ۈ'o~uN\9i\dR6m,M46}LSOP,bʀ1s/`*`iIE8Ԕ0KUQ 㕅FCgЪC%5D;)VO>Y(4z龄omT̤pxx>Ko &q-ܓelY[`g* 4BD)5'r>|{l${X \؍g?gl}18eh-;p망כÇ=AoD+xK>y&q9tinT`-9b0dC߻cj岚;D&Ε*|9lFl\aW{f+ی_adU'q>PZM4R' +>c|ƾu9rտM81ؼpFiù~&(932{5>c$ &x]7910RJ7 JzGIH"z9~sx^O!6IZL:m w *!7s\̑Q-RVe@x㲺j 3l-.;"j X\N=%j8X[J3t. 
ġƉa3e*yfVO "jBD d`td"$F) RU,:|U\ =ZUa,|s3 -Q hqW 7(ȧ@SP &d$'oI33&듭g#ȑ의&k׌|v5m'0Dbyc,Ek|4yBlpwϣAuM} jsŤΩQ%۞hv=ax;'yo"4ruLo:x WjG-7qUmvlIZ1UӮT* WnUj^ֻU1hN:{p: K33^"S({Up}C.ٶ*͚La|9=?"tM!Q4Fn0)tIpmj﷩dd0Z]}{C$~qe}+ԊlD'k^k-W.C㣁3 G =4#ԏ_Pಆ%uð \)!7~| JDk^R陗-" c(cui:ϫscZgDoB`/ &)k T J#Qڧ+ni{73<=anZQ00m)$Q98™f4.ؔ†q `h%F0E&!J]/ɿ&Zv'IvzHhyգp~kɆ~E&I&ABԘ6-'_7yNys2PH0 %bVKPM.nlz5T,R0f0sx@P?UB~ȹMɹ9TS}SiVd"q4ܺx{谢n0q`Ya=0G.}>gV LKgҵnFbzwqm 8|Ln2O}Ldo|@~_7ؔD!6gL(Z w!w8rSəp5{OCnwDP)t>] =8oABF-ARs2ޠAŪ]U/D[OSʣ2C@3v(\ipGEpc!Xj]cǒg0p͉yx2&I\`"dBA#|;\7M$'i΍FϔTXZ-BL(sFt굴dNJ&V/&H]&| _:s2O: :?Hƹ e¹(ŗZ2$7w4;~6N#X[(1l&?1G\D16, SF\?Ę#C <5Lo2 ~1fnσV& U0LlEnboX%8Svay,uC3Rb gȃʓ#C<Pi,"4s]1MAGa1MQ0*!z\gk="GRťk!Gs㲅.WW?fB^K=|`nƘ6j;GNj|ↇu}`{u} LԛdafYqI@Sti [><@?08 Sg !YR)*#m<`ܘo0w4=.XwMl]Qvw?m.#FJ#oBMp_/ܴM=lQO>~\fg"sV3;Mv( m0&C w7oyW6(KwyD6֨L2WmQjl9]oBi1$x "2DJ&?Ov}ν>OO;3\}-z?P>tP6`9{?I/+>E;_|-rJ`l'sc-+pR{Xɋ)]3=oY6([M4hyd!ܨ"'N  0 IKqV !#X}כ2A m=M=e>\`.bWCn<.ݒyJ5uE$DXӐJASXVƄ+0IF !%S4@_Su? k RU1vMD\&w;_Z#v[leRr0_,%0}-~7n^%FzNl7;_/CDi߿~cz?!hjhulJ'ێ؟x3"n>e'SäYk0].B jyN%Pl9耔M_Ş.m/8B~m~XK /i~ڷ3w ~|43q̌`f\͌mbz5 K\.>@CaV\.&Ȉx5p?߾aM5lfÓ=tSi*ЉO_mXF!GVDN+V G3ws_McMl"5#l>ûIzx ˻<=?Gם-AmT;0s5s9_ t=*%,xg0PVrRfE9}Nn~):_t)ۊRh=)e&8-yxIL ] .'>89u쭭J[X'0b({ İ7Y(oҋ1}jc1ύ*%#X4Jφ F)4j9+= քK%F[kBCjX b5/5E8z^6^0mߏBy)7}p؀8y@.JTz=6ҹo?iC^뾥Ac]*h*5^`h[Â(O,@$$u/Wdf3cfy*G}x) z^|]8Dc\КU3滥Ԛqſ4XSfSGS^MG5;hbJn{bx8ʠaMvPə7S(LWni;og[#u߸_\N<﷿={ώɦ=lƜ خ6ݭZ^#,j⵱@ s Q޼ۯKA:҇4*a*yQ@l~~/2_v; ӝRo4߇? 
)NBW?U=`&))O6'zYH/]ph'Pƛ!\}]<G<އӴrh93)жA yk pfo~>]=dWCTGoE3zIŏP{~h*a(-O7-c Y$u`c"3Q`\R3$?QZeITfݔx?*߽ߣbHgqt^x9} C]G5nUn"UǺu*^ Vpm(>:G^$ ȉBϧܣ"=dCWH9OȤF:=q%8K莧&WŢ(zYiCzY2"Inj>gʼn2%ZΒ ;^5OW)9AַOm쯅%A].^D #u)LZjZΒQ 8KNDQʋ(f^Myqmv0VsF4NHn|eI: ƾq}¯~hFIa5fN FUX0k:)JaY/#&203& @:N\uy1) 6?:Α$0)0RfI @ XITQ_2&FJ85{I-o%E[`w&-kہ}$n;Ѝ7 O1ERj#Q:x'TZfT!.lmjR2$6{l)h0* VBWU$EX}>A%/tKq 7!4L#N`CҖ'(o* X&ZcZ5 C~}<Ѽ`m8y*u;Y'CRcLe*0:)R9)(ēPyH;TJ3&4$@ڑU@/1I9TEhc,4%M$DThTm#Wi5ʘ~!R*y˰^s<$8`kR=n\GOX*>nQ ~ {}o׈!l4ss֓c|k@; D& }&q//3PO?f~ô3w ~|:P|(_)VZ6ssC)D0"33S,7)[%,l0o9 _M/@΋f2~e&MjʬL=pYLܺ5Z9Y}pF0ߛ]c,+0 ,wW3]/W&dYz"2 [7]4ޕ5#8K}(B3c=mG_/^Nkp;}EJ*U$WaG$Q@D,@⃝{<28JE}xw4zI+GQ=Рr^KxRkX\ DhO%ER[KgD9p!F)x,`Jki)/ryʉTR6NmSu;@ "r{qp+M @iI8YDPk fjj;ߥ"Z\>bA8!?wS=N;wE=N.~ WtUʌ |zq◭R.(?*oS'~r.9ArrFD+㼴Z;NUnH9bp^4C952Ntv6:dLW ,K ͵6W['(^7 0%b#hV\ddXD*aJ$ܙr v#y^kg#g~P 6 B^]~҅(fOبyc\(l(NZ`.p2{x z3mNu?/>@_]6m3<,Zlg- q៮",\wns7{ >2A'c&֙g"wA gDATsP-1DO(h4Ew ̉X˼#bXSgPMRh5`tN2DH+)ɍ,pF{q*"RYע3XKcт_A3J 1 ``w@keePͥfJ;2SD 9qI/`y Hgx\B!\/Q]y%U/*NzN* TЊeƦr M )u?{aOOl >/?\M7˒KU~;Ɣ4ԹQD({)BXқGn|ӄm_O-ӿl7}K<{N'>&nøTW.Y2%{MUr1#:c%VeٶvnMH+#%&1r1#:ckx*wjj&$䕋hLiv[M0|i&1֊LcxZzdi6 !\DdJӃ_nCn1n{lOڙv˞h]քr͒)uٲ%Hez-:;:mP`;nݚW.Y2g՗g007zn{ }̉w˗W.Y2i}o؃Cn1n{0g=Ѻڭ y"!Si[nA-c-KM*״m]Nvk@B^f9=V~jũTJJP!\9rT%x_ S@ﱢL#y(;U5Ħ;[{>R:U\ U:wpU:3TrTit%M=V5(,<`ơV1 mip&/2 ?K&D2:GMhFpD&Z[DHU2!\u S.A1'5.m<\sdҵ( c0Q(2!$)0+~$(a@B*bAR#3X`ZKu _œD eW`#:`0X|0E!e,ǐrҖ~wrF*o$n2hYÖ K8zȎ= =@N糱UsW]xLCM#%%rsQA8a:eѽVdFE1i BTTֳ- jSL8]gܳhȃP32`ط l0Ѩ= o.EGI8CH#Bp"<}uԃaGK4Cʘ{2q蹯7qB+QM2LuOf6}D~~KQ&0Qkr}{`TvK?966ADO摵Y݊h m%70Gbx|uwH d|>!`GԔr A|]@ÍgLN`%Y-,!`{[pY*(D1h!g::f;6p_?BR,NE֜A)ݍǃs`Jju~Lw|=c[^p;J-NFR+lJ2^PN$2EyȾŚʮ":0.: S,-56HQQV?=؟gڻ`ƼXj:!FٹOI4PH0bӴn! -[{+27ԫq0")x2N/C ֱLih mS9;EA8hB.hZJRSj˩~(gzI3řx2qpf RZ#N߳֏aJ S U4Q]*(aSKpTk&59&=LeH Ǝ>IH쿘9ӅrO˽בboJ?֒Ȓ.j7ʂ1mrbz/a j@_d=n|Lb0_iz`cC]U ܚ>_T%N]| ? 
a߃Ϯ\"ͼ-Hx I92gPPJv]I6 {麘A0΁S~Uˊ1r9FzX"=U 'qT~0Zΰ p`o^@4U;>dp%}V<"xs }O5{k˳_^FPr8 <nʚ f k ύ}НvzVL]kNe4lwϷ9'c夺/%fioqMCc۳J=z|}p1uhXʅf;{a[g^#{$+s5)bM|ը&7c^-ݳ:ͲXMY"d3Mwgh hR}bdɾ+L Oy8+(M$#3~eA,X79WQ N#?3Ĩɷ8x**|ƥ$ !%QoIjЄr\[\}@YvH+_HFB`>;c*M :*$sbocN0Cٰ(=M$0Zn|pv\׶U8*)MVb<-m.S5//ZUM+pmquzXq&0y+3wLzS0Bt0I뛴s;Nx?AuCˀO[HQpw<< [68cw^ya6]hrKnlw1>Ä#AscSբawfݳ-k+{(2ƨ%3<{Eat#_EQ=\Q nXbGVG;AZP*0AECDC, /]G)*/nsKOD6ToүWbb:OW5_.[ fNvW }?׷^1AC^٪iӒ;/Bkt6]ٴz>%iYYVUIڧͮ-i9zqڤvڴι<MUf}$%/ %TK(d~D&Z0mFkusrn/,0SrgD9zLP\,U͓ 5 AvW6cy<=_AǎǣӋ}ΈX.a\Ec[s_*,NVIXI`$(.H HpߚkPeJ>(*AXwdzS&JYC LlIgNZ@ g'2ɝfǿL3fk.{v9 1ٛ0\w>?+m-k;/eUZX<_d曶*s˄AC%r0$щw W2\2il^C=%!YO!= ;aA.k$u'ւ㣍MHnI&xOH}aS>- gz>2o#xXBO3UvɞB|lο)# *TZJII7i#0Z\$0k#*kNWK륤LP#a9S)xM^,OR),HRrx8-m3ZKn($82[F"n,FGVK,Pbn+[rZkA8VE-ɱ lq@A NBnJ4(i\;o尾eDayPEPC.Zpfc2QMNuaaؖ a[&=atȓ`X{ق8V317ZGA(4c%ld!ama}.! `e=}ݙqc{>k?h(C;GPJo˓)pbJowɷ(Y_xC'.%HO2 NۡFPYȓ,'Y"OD4%r>]!є3m$٦&THx AĈ,h)Y)XeϷ9xXiEolyNR.uYœOq9x_dQ@ς鹳4Dc#MJBb$u{ntHbnd|]pΖ/x, Ws\?+8tSdc[6y4s)I5.&EfvG%U ,r S=W\:eqP5+dH Z> B+B/YOrx%K8 xQciz" &Q H2T@[82|D+g @(P$n4K2RyoC3Z*cl<_6]i6_[8Wfak-s8;WX 4(8GV5MYIh F-~l,YP$g{r%DS9i,} mF?x^`/zxS{;= $0m+GqW[p4:Q P}mn*ƵdڳB|{9`[,^5E{2{L4^״DK?< +賡I"}"wÚG-u * 8a Z (KIIZ (2"(|LkbiuY@*$WNr?$efNOl7}WWL凟nyǎAJ)r؇o?>ӹq1d:zwCf_G4Oh|v^;>S_ ǚ2k}48G ~p]. ]ro..pLєLSjpZ=[T>朵ƔZa 5t_R 9YJ&ih“I! /_j"gaIQ2-/lVAÁVj@褳+p_]0_3T1_D-T\n1bvBgiЭ-KTt .p)&24<׌Qҝ^%+$%/4{g t,v[Rҟc1< vh'e9I#w;nN(+z;Я@vQ!kkM΃@?.ҵ}cQ/zLO-[mg+$ba!D(Xȁ]ImÅ=1=bF?ګ>5//'Coud5gN'hu3&3mwۄ`L9s d IvICːNKֳׂwBتib[HCE$*qX;[sGKJZ=w8h'aWzl ZiH뷖T!TMa4 Q''1!z͟=$N;ifCQQ(9OCdDo'( GD AŠE4{j_TUOh\ZvXPkK4qIxm-k:ɩ/1skABp ͵K*M-kT\x0qCN}kHҮݶ䭄ÏVyɾWV"$Ǎw:dב;X4J&(hP+ɚF ^[l C 0G;sR2ER1J҂ uW S )foZRL!DҞt- T}d5U|h!}!e!FtG[ʆSWGψ:VmL_^;EHoxXT C"kPlV\w= V=:$E]qnA)NcN;Ӝo;eTHnAR9`$)A(tӻDn- xrw.r44y\OJSI:1n-ڱԻų"?vkQ?k8-y^-(Ot48/M1FN򄅠rF%N-$#BJz ̆c%a5]ј.%< p*cJUn=U2h!U1/=Ri%ЄJAcYXRϝV&ʯT)VQ4_`jB5C]7ävRs(ҁiDyIF%T5& X+ ^H*،&p. 
69lW6 dcQIQUQK_NIsBʂPeV3Z)o׍ɔPH1 E㸢Q*Od-ӣIFsOr4://kި\=UVMհmFst>ӂ97wɎnfru3yO7o(SҷZSgOvFl0W'vzwbn٠e]88%/MffmkZ=}!VLr9D%m뢒+֎WLpM9ИKŌoTAj1pbY mR?nG/l.C76Qd տoW=^:ݚs0V. !9"Ȍ!gT Ikj>?;iר^MlA,dm8j0cLn"DG=l9ZN,|h͵'Zba*!cF)$!OBnFK,E~T^ڱ&LtOU:ʙF(1D'd6*Y9ol3YFox3OP ),(8Ψ滋)ӓO_Z42)eٳ!$5Qi{nxF66{ыbjbf{py`gyDR~-Z/"Dl7v4q4} Pʀ-gZ[l8%=asPaOƒC}U|wMޗ^H=0ub&>:4(F IW;K |؆rzǻWiiD;Ew!lF1in=#JJ8xt_WGB5ȃYuב} g@2i(ɓ'Zp7 ϙ q #OcڏټO,5,JbsʣVd]Ɩb{p6$.CzAJ*|PۘFhV#Bh#EB4=y11P=]mgvX/(ܐF:⸦4\zGĖIl1iFJ Or:B< 䵢r,q7:Y|vݘ *j%@GS #pƏ&|8, = \4 %,0v5y&S'哓3iwtlY GPGZyM,Z0|^j!)G̈ŷm8_'~ kY8%ˍݏ̝2WӀc$Q;Gqjx|fHލbb56{s6Ac WOʽW 5~h:S}  rdl؋wiH}P5plX(WK*`{6ͪ(R}  E AA$]6E4b 4#J0WMWD%_j`f5AoD^0RcRȤB吇IH^:>|tl=u՚WਯR-v? 7z0Qqya#_UB59̞Ӟ#d$V0$j(jlbGja28 FE]QojuIQo82(Cp%VFhKc+)qj^\4۳#(1ǹURaPXqqۄ2d7i4 G@V 2L5A)2ϭ2:T#_*MJMQX*5\" f9M2Xd()aiꈰDrO 67. Q`-W:Էf*߳K֤;bdY JmPfA#)z!{*Jr,F˛K+@`] Q 1Gr5t'v&vD*.l ~vϖ!_TCw~'$<#p6^%>Sk}u T/w|'IKI- uװ@qE)!^CRxq_įEKc[ٻ .EMyVG`WG '򁳳Mˈ1FOag{d'h8$֜Qx_;;]&gC oOWkWCܲ)?L@}MY=SЂba0D50qx1XQw\4X^!h_Ș1ݔ\ܤLR8keFI2)TQqJ#,I\HI8gJĂ%"Ia& UNTɆ5kBTz6M cmZYEQmo좮(LI.l.~Bb-Eϔb-?;ZK^lI FQ#?ÈrT>k /%J>@jZdW}pG^S,/SrTN/$A~LULgj8W##~~i5mϧWނqqwϬ@ R?Zģy]HU6ݍ}\Uc̖JpfR59ݚ*uJrڎ`-۬Qa&j8إue7cW:y32dQ0ư%ol$0m}]˛O_"Hp]jZT opV35open4q-O"#JrjI3_&ץs#d-Fsj 9 ˕P?sxlZ+r].I.oK=>aNkL&`Eh'4 R!ڰ/}TGh"[:RU^ꈇb *6:m1kt|zSHQ/<@<#VmE?_^ z/[/ʗQhƵK65 ,4w\"cX9T1Ӝ0ͥ0Vu<El1:v^7wb&ϲ+V%b~9廟E/+ #(On>")$>)^]I=2ilnз53M]S3W~dF$| a&9HoD (<}z65h65ЉS0j(\1Y[Msn97_ U[GIR}hQ5;;ljfRa@iF[=i#Β1ҥH 0x/|E9aG#E39eҖZ?ZK# MFRIT (l, #c#c}`.e=+-?/Gظ jȼ{s&(65˷=ҜgQ 8堨d.Xs0RIN$0L $`yt Z>8Jw E@}a}M)W5ԭ%l(!.!c1"k+>4 HBSi P1J u1VrV!iR #eL.bhʙ(̕ԆJŠL8 fY(fXI˩3o#_6nk_}_+юSic -cBX#˭=!KXR"1J lF-xj59C}v[yWt +TTݫAy.͊b )OaEJSg^ gĻeONF7vyhpK1:sc ? 
Ng4 y[?B/->]c>_ ݸd<}ҽy䅙07Jh/twzݯ.xGi*=C䭛X9_=hLR^MJMpz-!F"OW%Qnшj:$(MU:Kщ]vۣ[Q'j_DrBʘѹܠ Dmn9c\(9,uYRb KX*<[7rlwix+ˉPzw/ԡzNjX.xy ؋in\?Ϟ3^Rno:e(i7fr~ڝ-wהrfM\lՔ)!u!ATt2-9u%m!"M(eP9(ny !u*s=yj3eF ,rGșV[3lwݷ =J\_VE`X8Ea=hАā?Y"GO~<X^ŕΙ!sݤc~{}Ud 5,)%gX>X; ,Q}yH STiI=HR,8\ l g)>}t3v9SvCR%.΋Vk+Пt9C2x u[󊈔[Nt̾٢3ut?-%%;n$$9K5&2Y1Qo_y@J'N+%/z^Ĵ k<>yk`%)%Y,ָa4c:%O07gt︿a^EQIhQwx3DlTA>CH,7w9bW~e;7W*b)I44a=a,3'exhvsse4T ໱*Ȍ:e YS3b°0#6ǰl'dQ3tF"8 :sT#4"7lDR~ ,%hPTUؿ˵Rstw_UN5A[RD 뺚 b`ɾ#}ArSHvb9R/D:䓮[J~#a$FQ :(2“ k{a4-X_ۇp\lhȱ:8m򏜪S R!<,LD^!Cc Jtyaމr@.[EY[@?8.ER%ɲѓG# 1cG&~L 34>xK7[ޫrj/~Q%~@~3-||y;<2otGo㹹׶>ps.yaFw//^>{._/?mzFgx/ +{qWm|JU2@;{s݁$k]?_u: 0+OYJ?ޕ/x.o+׃Ͻ֝Ou.mZ|?x41o G`>݀`~4pj6CwanAx:z;|T|+^;o/ۛ7/uOo~~)S Kzw Aў(xolr`o_1I௯^Ϡ~~ݟϯ}%0?}kx;ytzq3A@ o.Gn|9dp;r7q:;CVu_m+pTwmmzd3T/8'`JT$?34 3e8]_UuUuM͍C?Tw삷[=$AH Uem?^im/@>_w|{}+*%wW7LlpsrٗP&W?L&/s^[gax9Onz=sf\Hьg^OnN.W~r;+ &S.^,(r:}y ٟ_߀xxZ \zU%-p :xG<^av~._"ҳsqDŽ*0xZu |Wê:ڞ `7 +Yw /(JgqEՔirpWNjޝW.z#{HfѠ4Ũd8_NO?ݬR\}3pYF/ -1Oo.aMQѡVG.kNzaŕ XBW.C΀I0Cš_G{tKp ?:{f<`8na*m_ܬ/Ss <`^܎:0ئҿUGo{@cϿ Su/ѿ8G_g_=T+:T!+%EEޥQ]o}@]T 4r V"g?`ְՏՌlХFfV:+=R/H+5ꌴIVj~8ȼaC& gY0zlH%ll}Ųet\e:oj}?E7A=Z,wwT-ZsLڶo Ŏڷ7O1E,t#Z͚7${^)AjM^L x. 
9?6,`.ˊRYQ*+6JeFb~( VxIABp0L)gng10m U?¶oњ MtZIPlfA77\TxQ0JF eQq".A1 XNP;Q2L%eb춃GRuD6ݦhZ=/3e1s; +BΑ#k Z lTƑ(H(|ʵFb)րa9 7*<@pvGऑ *@q\:}}*֚4)ߒtSNöQ+R?JZPkՔHA2M^ 㤢B{ QillK ]'(̚>ΨRBYp|]Ȕ 1Ђ7-xĶ^P,X#rGkׁ"k ҏQ x!ΣxmXu=~Fk>Zdt5228h1#" BȲ@ cr}Jjcn|k0kkym6|0l@{.Jh\pnGx-`(+@˕X RO"( WK/jD1:tO٫O>t&#)G9f 4Ќ1!GW*~7޿&W-; 6 U `k/)Fr& $" 5`2+hXTV'NۏhLij($rRޏo9NIꁿ)Nw-rU7 `qbOpĕEW |G?^ dwd59ntTwlt|vBFCКġC, ?e H0$`%`"j`c<( D8"9J AI3JsՌ|rH]讍̢g!Z,;M֫$']Syo$r^ 僌S $zKioJUkA\hJ5vߊz:d pxu^H B/Gq$Y!u%b`V7BDfZ@0%x}&WC> +(@`UMf%܄f`ٻ,2!s3ghl~0SSnzS.40;6`\OV6̺oegD?\@kazA"oS!*x0),:#<,]q-S+F{jG WJ{i_^\&?-T<4Ӭxcˤ|@xnw/ڦ!?4 M@J4acg(UPc;| EMH(#1yRcX)B"bLxpITu0F l}aum At Šؠc19Fv y201{8p1{K؞|4'K 7NV 0iZ]q[1q⧩1~Gk&9kc$Rm,$*)Ǥn!$s .Tlf v]wPSkG&dѥ"b MNRZJ'5 9Ci 67_dvɑđđđ2G.n/&S!J&XP"M,FRK Gg[$Fr>0* 級w9Пgד_\}}e Vi.3f X1'N,QE ( 6 [qʲS!׎-kГiM;SViXGסa[lCC1Ҫ?zjzub%E[UcJVBڭP2-Wj&C%4 $i:Lu0 5Sm=UbUմQ(h)wn\Sx1-q[dyb6d=6kIx%2-(ʄRS#|̤[ţ-_%wBf3)C J0Ej CD\Ё| (7Nrx<|m4D Q! +Zmp0Tt>~\oG0'30I X(C>B,HsuL ^/=3X"GAm,@rvjg%C${'TꝕҴ X ;+0a!cU`oBb0olr%hJknTKlՂ? Vb T#6Pr QQofOX&Wv̝0TBb^ a͸&t%w J$ܲ wj8!!/1,0lIv炴@nϞTs)U',D~C| @":gLp8Cpjٔ1}KUM7`LPqi"JL^yG۪{2uά3لVHS?[ӼVQ;Ε"b~l.T7vkr2:T䴤[gzjJ3˦yƤj~IX  Iɮ'y{(M5S'(W?AΆ4=3i8g30u;.&a錮Ψ3Qp07(sY6) \0_ cNnYWEَۧL(TpT]ư"WGxiVnOsr)R/ P5r94Bշ7^dsűLAw7ar%Zzغdv_^nYh*mnk)Qt1NLk!! h\2>,.CѬܔ!@<"/`DQf]d+RHJ4PgX!"Vi;nQAWڔL(b`WY::&fj[mBdÑ]߮U Z0BEi_4{RQAWy><]$MКrG,%&DrQKEAԨLpڋ3}}`.E*O2<1%yÝlן<2mMS439&Of/,0q5I0k#d ]1˄OsFF Q9̜2y"!S<~O~ 0L*/*-֒I6gIa5Z 'K7K3.<4RM][` .X&q^,>ʎ)BX3;i4UyKXIҭp%)ݨ*NRj‰b$NݱȰf$3˻8BuHS ;I.L GX]$Ak/UE^X֊.qtc!FtYΩsbׇФus`ḌF4KwX mQ`I\pH5! 
UL6ۀ `iu9ZS]&I!wMT~8a_y{ WYxn 3 nd2ZObڟpFfo`2n>#sH"ZpJjHfؠ4QUjfidiF(zVufna^rA>2p[XcxbV s%*iR;_&{31~ I߼B9}xNNi2kn D՟y36ڨP` $X$T0\D0rĨ3E}{sMrJWʭx+b^ &k)ZSWѢ/|Ѫt:5"QL8Ѣ(^gV`- K2W9!ĩ32c|2!`4Msr)x8˲o6ǁ 1hDed\Q ~K~{0OM+Sȷ<#bj ta>FE9/F*#Xh3ec<@XHrB(4|%)8IB VHrSVd3v.XZZnC܆ΕjitΏ ؂ //h`N,8<)ދ ,d͟(wNE3"a>RSȖA ^׽477S鈂q^Dok/zMS<FgOϳglY+5q Boz}h7`|"\sIWq;; NX)`x`mT3ٸO) X”k.A",4hgO0*Fp/\Zr~r+'ͪ (58d%KSHc&h+"UqDICV|`),K07/PF01[2N~{s/>+xtË]Zq6\<6f=*aj*fS_G2gm;?=>|vî5UrYŨ'K"t>05'|c#M_?tr;q^Ao{p67~ft:~||p*<Ǔed(g;YS ƣ&E@1иlY >&Uz<ť3 eM"9I u88R !չ kFFu_r>#0٢~QX Zw7t.XЈnTvڰ₰wՋTĨsՁ ӹFDh"8<$.9\]:\=T@ucjCC^$Ӎh4K EQJs45@8"vOAwZ#BKrW%tߣ2 Kjh _]R@<):"DA|,,\g,qqbY1ͅ3#U.%(K1>ٛu7879-ߴ_2}rDTtguA7KF}m:K}',W#FBq*NUXr("FCF&J+V[Hno˗.oVuR1TK.h䌗_  vHzo[KdJr.K+/%@ }Y \ꑐs)Ku嶖>o'RnD+IgIpIiiʅl}wr6+kb &˰FRX+Ti,C[ZS+ՌD2n9$LeU@\]/4XZt9F! 0U(AaW2c\ ̜sNH0^R2/9#k|3pf0鴿>eL]:1&hA \$n֥fR=?Ɠwo-/Zar7j蹿"xY /E?&qYryU~Yf,olJtcSCK%1i?~e"Wk/K>Oe$ ĸ]"[Δm^bLd6=-9NN,IIĒԹ,EO)9*J~鐇mHװR0d_ji3.-t4ՠѼVĄ} ځPYGѝ䊗J2|/c=tә_|k+ykx0feOxC|!ߕ!^`&qc8d sXcahتhݏCa_]zh\ы}uoq~.{uv6}CpnC>}VEJ&N BtSkP"&%g)'0&w9y%KA㤑a;nqP/% ׎ tlzż֋o!󞿅{mѺg+AD$&k>hthZS-9N/{wH%{ZS=PZ1A&e>\S#,7e(E—7b/F~<5 ~K~{0O!Jtj~}ز B1,|.Hu_H ӍReɀ5BYd`Nc#$f*LXJ2S.\.bL,uH,UcT#K,=^|[]0.}1$0ԜJyA hYf #S1= HC25H3 Gztq|Ky 9~8PA$jڑps36<<ޕB`[zHFMMh4fduY/Ȉ/f8ݙ4ջ3[퉣rީ&ۣdl"#=Pkލ,YxERj,`tzp_S/Y0MƻJF'@nTKhD=#9T^+\1] ހx.{[6^ya'B#} 翻HA0H@<˜"C=\ \h։Ks։AwSΝGa BT†P}c.}kS@( K |CؤSB6%*!Ai)1LD1$@Z?Qh QA"<${ǔ@bkg\P?Nz!Ēs!*:zd>l/ST>,)w 17ŹwoL"qb9r z8篃u\BXu4}H7}$K=Pl ! K{6ltAvƎ⠑@ 2קxm;\~\CˋixϑZ_~])-(QkMz-`485nM'Xd(c:8<grr?4 duD)@M̮F۠̀r11\&sFG&Q%O;SuTAAݕ8;ՓqK]AwE+hQtHU^ȼdEJ9 ̦j=TcKFŰ.o-EoBTpN .#(f€3,jcC=)"Q*aHFNszP0PCj%J^du%C'-k_ȉUf',̓aO 8(Z{sGW$h\%]E)+on_e yOpO/YJmsiyo~Axn_a&n7>Z3"vyzwc魇;q"W*B b )$4LǙTg9.-lR"IHIBStۀ Lwpir.K{oLM cK#W*xx e,653S7}S8?A`8T ຝt.vҥRs==rG)sBcDbp"* ccBVQ8`A ;fb0EbA^qiRw:7m C)!ܬ",o{2=~L$Wo й*{ (ջ a ؓ1SޏDy'ߢfluM G !FܾRKT@vAq}">PFam,z:\Zq,`#p^]r:@nsMϳt2ɣFE̬/V΢q6 MJ%3e趣~E;&::Iʬbez"%nZjf55wx3HJYNG S9~uKKE:Um0Įm8=\TslC/BڷI^RUKOwb TAaceB,ubVG$YUUK:Y˕Tev>[=0"b̭p0W,:+r sM(! 
3 hŻkPCjF)n:եxI:ǺgեXUwX-#/RYj0嶵L Ŗo:YCVj\Ȧ]ҧ"E@3Zպ\n7ee2Z);$*->RI^uڧWC [΍{ćĂr{֐|&ZǦ J}>٭\Lnefv_S:^:nMXwnlg7foVW.S2[^du7կOpݚD36HETOv!*-TZoȥŽx"11!؆fs#rJ -Bڗ/U_>8 O1H̭0&Θ*rwKh-. s/2lgrb'Yu.?dՕU4]EG+*Ee~uzZ 3h3زy\c&pٲ{|iT=Yvuu׆1iW9hiN<@݃V1%%=Im Z傶W߷{k s; !+]@ omoY{nbw> OrJ,ڟQ6a\unJ΀/YW :Jv"Jvrxn砒Ku;eB{J`$BZ/ㅽ~;4r7C.*; 4fݕ =ͥgr0 (6$9!䒈R)[_#:Sz}Z$ko0 Xdx>р0Q~T;S~l|1((tP?rzK$@BZǡfNʁP'J M=aԐʹjU%M6'M=2^T tҪ*(mF]%&ȹzM)iU954Цz8?z(8P"Xٛփ=y- rbJ@HAiFnkVMJ cRJ@!#"}J P! y-表71ł)˔S!QX+]}G?ULuUnwf%rRI?Y.2G=!9Ae \ FL-9SG)¦9NPr) ~TIUY/~8YeZQ$9>%jnH4*+ɵ 5/~3ײ%%{3kGmX's'>~)r=.S: +@Gzr IHJ} M`zX5@9X=)k"eW.C}&Xoݙg&As^p禡;؟?M7;+6Cn~̄ܢGYD2W˹ɏ6AQP/1؏=N*R.5hҹu=jAU bߛs*>ΈKn"Q(q&@թjrX !Sb v[ ]aizI]$Ƹ 0psMgM' ڸ-( sN9lOt8iM"q'MF 2iUQy4 D-3B⭅ -2sk"Pe ޖV#!e ħl6UƀkE\b!MS|~/WHŵhwqX!C3q:NBUlcW0Ϊ!`wQ8 F&A13b6кѦla .cĦ>6D6JD;cdU!_ǓHpH#5Di+ MǢ0Z@1fd o@j&5͒w0z ꨀHxtJGZ[Y;JS*F#BEBDH gҗ&X,zR}šQq,)&u @кkpO,o˖_3FS"0]_vOOvPYgb>b-A@￿Xxү~=GZ=O<={L>N+ujoK5I3W䏻7˝Y,F< $@@dx+<MQ0*&_U<ҽ^Jz7%N)$hNK_>yѾpw5ޒȟg)ju<ӣD731J.h9tL]y7m7?|3+r\BYr)#-3 zhSA''$tN6TJrm-/ɋȵJu@T_TUb;cm/EfDWU 0HyDSssJQ:mY%Q{QJFCzue_}'=fUA4>&F\dk.,Wܼ+\\qU Q5P@O%+ *Htuvva@$4NǓ28*µ>C:u)!Q@\ =k'JW^T3BŠQCʎ =hS5Npl)chiGH} "滏i\{A0h9`Dpn@O,I< ҁR÷@4e]I% ~ JlUt[?{6ᗃyAwbMLi ];u>$v˶lQ(ĦR*V}H_mO%þ^ ,I4{zXfɔ^AS.\x pUE^ =#KcbG>.Eăl^Bn{UW- @c}@/A&CU-/AEj{$XϿʝ- wuqD9GaCSY㿌t4ߝAO_Fd{~w$A/r|R0,'=)憙7W9bRl|x1.P!ѶM ?d>y^ LtHLbQR&LHAմ8{))fID"9KFX 3HZV"Gnܒ[S~yi- ;L P(t W1Mʃ )fQ'Zh2!BLE#!@ s9px*fѷ2Ьe;:ЋһPƷ&?g5FV,^uK3Đ@/ V#I/)b#G?~\Lj8d˼2|ڇlkKdӔ`(B  J aCw8ntLu'z RD%!f6y:.Uj0S"5fϓt=:h2 ~9flM#Y&fجoNVG D|XcFp)n>'tY[I0`E1ϒhLnVzhyd^ >ѼϺ4˾+Ԭ5: DH 3u8JG2e(P!„'o6uQ Ɔb"C etv$7Z!ȢKH-u},VۓΏz>W<7W3֣}'I'`hPB&3p$$2aq0$,Rb"PE` ۾߾eju-@?NSO&`V|r7wd[ա0<->Lv%!j=yg}e4B}(a;gexXp,WYVn9Tt#$v 5)>NCLHrC" _~+%|ґؗ0ThYt1X J꼘G `4;AUz|ֱܗ/p|m@oE:uPM YNh1 檒xUP=ENp.S"kG]Dz@`!ZWuNIi%h9Cd< L$d+鍈DlFA8Hy1TZ;^rTYsbB% =U1Pn+G8˜c鸎F}6!0!- ?⛌I#>D1$a#ΰjD(QqR584bX) u bR" % 4RǺALaH0Q#E"I qS&ĺBJVCk85jmS갆Zh9=lf{i9FDymҸ 7jl!źjhONx-p$Bai]s"^߯J^Uա;˶wV#iQ[vi%^P"E9UN^DPLZ),&#B2",Gp%N0 Sz^JE7udrkA*0:qPU9dLSQb58,M( @G\RSh&5c 
E[GПq%HzV꣒^LYgֲ4]+s'5\xZ2r$Kkm#+nzFh%j,$HXSBZIƱdjFs1䄷HU(bnE}`j+1޻Vշb%2âe0SZLdq7Lڪ* !"]z,LP:aHÑ$!0 R XcBbaˠft<SqRhĩRa>WrJ\ ^F-{SɱwRY<<ƈ4Zxjap2蘌L:0z#o䈮1'R2\ Hp1<] C֧KHtK!w7ݭA˩`Bpsi. yh5ͮ;I}:%mÈݲp#* <7490KS8Bݨ:,NPYzW)( NUѫ+$;`΀p:گNSs [~/M/8k^y% I2Wcy%NՀY Yvx%Xިoh9BG0o݀zЦs?mĞ6m ]6jL=; $@OLPv1bf3B}[ "1"}4D`wz%^x }U-}9y :]#(\=4G>"=y 8wj#9}x>y#2_Vۢ0f&mQgk|^̮U`2W\wNw԰s$BW{/z/oo SmvM[ UEăx(9:wL]b yO{QTS;\>V]uL^NS wRbs Nԏ48qj`$e܀`#VtOp+_ױ>ݟ0zi< ˺Usb._k7D]d`EkӲ"gpP<Ͽ$;۾g\jD,%: P_R Wlͥ r0,_{ o; aiԜ,sw= YQ5|l_F+a vN5ڣܠ<yEs0Srv#: ԟePh3 N+-s'3ѓ^OhVٳBf_db3]Ȋ #vWQ?W_LI 3ddHXJ79 W!&afXltX/p|A?桶Te _]}[m}v"CؙQ`#ҿ13Uy,f%23t0ځ )GKc4/^^𸶻4!#% #)8B 2rt<(GJH`uLB}9 NC+M|L;Å jDi2yω΋g>]1lDLuE;2_/G\#FNh\ \ʊ*ue=EԵ<\],wb*Z'i,^ ZN]x\ q] U{bi+ 'ݗ$*F~O`qjIGun~K?sU _:jդǵ/WI:P,p jo8B=Wٻ6n,WXz6Uzز=ٲNvR*4D IœLAdDr&5I熃sNu-^WĚ=xAmk \4HYG_пZe+Jq+jќD^b_}5%jgVYE6WM:{=FDaV& NO!e11몤vUX| lHJS YdIKda)&MשV83(Tx/&iU&șij]j=zW@<0.G^9AX6O޿{Ma"۹-Ow7 ]@?=_,<0zP$'j^X N$[&;#Wlf6)y.3܏f˧h%8x_˧717Jju՛7#71>G<Ϳ EXA"eI:sY̤WxWu> 5{){$dXZ TT*2 3vf΄I ,SjR/EI 7SĚ37] 6l/¸1>9+&ܚ,yߝ~hIW>j+RD *I0B8!v| 5c;_"Vr*yѮ jԪ{OֲXB2j.Jq7$Q`_?2iUe<U&L hK3. /cKT0k aDUkQ*Gg!m˧Wޜp;DPFwe# c!:t Lvp$fMGR79rMCyQFR0lԣԆkJ{vp8ұPTv0.2NSJ 2a,8 >ɕKtO5pBEcAȖŜK1f<6$vBA!gJ1΁ !U!TuBВ`'Z (MEHqtPfytIPH+VtWzŗQHUHUeFڋTBhL pc(ݳ]9pLX 2aZGZQTׂSc$ xun|Xa<r)M`{qqb/j{d밷ATJ%ifJJw߃i߮"?[~ƶwZem̖rpej,v̓ڳ5%SQ^}E].dBmW`IpyC3PnQL!JR%`~q̫ɇ@WEES a!g8p!`R% @-ňiϱJ@sFȂk8CE^0y8#[&O>[Ofz%>/|2K7T[%y3{% LXEB/X|ι |>cDn;vWCoF눁W <k̿9Z~Éf!o M2䳹 jG;ةL7:]n 퐉#cavﶃ15&Q{A FD`j4 PS2STTYGa^*bD:`j$I|(Z/s)mQkùd$rRb (TPCwكgV k@sbt>j:3N49sh$G1'|>gJ>g%pA4nTQV9kϮ0%߅:UZ2:$=9`䪳|H qԹ_f>V|=&gK!Z[Ai@DttCX7HȅjY|Њl@ctwsm{7Hq%+HGۣGqaFCG۟?v飌[AnѲ\6a՜c;n^=dq흇:IhFyj`0ւ Ǖ~t.TUcDΎ]w{;gX7ZA]ĞӮܣ}I&S?#f!gM|t,'G74QmMߝgW85`AN 3  @M u|xtjt\-'4*k50-[&ߥ/ٕ/_^pZh&~kbUptSӣ!!5ЯUxNeE!O̊8U(jdo6X9#HlpbLhO5 { j(nӦTGj+u`8H w؄YdIKda)&MשV83(4v:֑lrn"є؎gXFL"< *cNp,\#x''.dN_·o O5uKw>}P"zgj*傿G}pL KbƗ2Ӡp9qlѲ˒] YMZ.kާ(\SCQc tSHe)kMb&be#Y]XZ".ŗ,5zk1v@W|U߆]Ȼ/. 
O4Q< 4 M;t+xōZ+AfcVuύ[rԶqkqc)0+q]vͩ[:d2KbĩRN3i~&;UN\9, g*ӨSsC*]S\/Y_OEGү}9mʣA^9u\O펲sڰp5C-Am}5.q+_p1SD}@_o38WCmѲ 1%K%Zg+Y!@W*+4f7̊k38gd(jE$bLeCQ! :cN&NeC :#Bk!kd5%2!lF,ÙH LJ[W@Næ*^QxU?k&eU:a K 2 Yo6!'PsKR!4u 1JɱQuСa)(5JL:B -(0퐰$C)R\dS2-Y踦ej>?/&A}HTzX&̱㇗HNHC7ޓ^-7O?M<c IQK97 r<~N3?gfZT>3[h 2קGw?`*XKN1s3fa*}\Rv2,Xa^/ I/8|JGyQ8^oᗷA7-%snSn>Pgz io `xUPzâEٌ1fw2vFqIׅ9:'ZcFC>j)-é fd]kh!Z^=p hv3K >$ei=sAO;&}dM:Z u7,mrhٲ>1ls܇u賖nӆ ] s F.:%[xv ZJ(Dwz*/Jݏ3~\mKDfO+8 039[q^nFyUfcF 4.t ", DTiZ;gf9}H];M81w9w~RD[bgY>sqTl_k6;K=n=D1pj`tz11=ӏk|?*,W6vw,Zt !6ԬVhp.N:j vhh]r*rhBbrgՙ\5]T ˗t^4f߽}u@=H#.|i,8Ęnd_)v"HSoOkaAl񗜋sޮ?Q"rEY^ݻ!sbT>=?GPa8ȝOa4?Kп𛠪6*k6~wNu_ 9)"@X]A,/UzD [-EG# X]~kqW%@?t3HNmѠnimv')(Q4MYcE Sr=Y;rzJʅD/q׾fڡx0kD/$o?!t1{B6N .-7Q}7jPH+X8aOf8ʬ ZE,T,ȌC5\HF zZ`D8! $ eP@qj"hQ6$Q^$vPQtѰUbl:94GN;Vݭ< %7 )X^YL.νxGO~bw>J@|٣~ʭӸ:Q~.-&'wB=s/5^P=hsYegr zo +KK+ǕЧ^VT}d~>|upm; ƨzPAf*5KZa)I!mCQB,9TM*c3$y\ye%OTEX4MRF2EBBBHӼ2e#7BPX@Ld 2+e,O"+H t:9Hq̎L;vAI䔡g.[qy ǫY'8sͩn7 FRE5tXA$œ$`IƂZ"ʄ *CJbC:hnݓ Zw'l&+GId7<+r4z\PG(V\4(DؘaRLQ׫Hc((~ O\4ΟLbiiPuRcm>MMA^S̰x{ nGa2E}uhi^C ;|0c|\mXz0~3OZ clԗT7-cR* I9x`ڡO?s_ʰzvt= 6#ǴQp~ٞZ-\돧d91E?z J^fӀ?3V^.*JRt|z:[1jL,M>U"^Wս5WzZAG@ .s&+l?j2XEalu/zL1K3fj ſ]@ho[}κg_lkKQ%Bm0$@LJk:L8;7o,_' 3&9BehD ;as'$Pue57Q^{תqoKl!S\\iHhkE>Gc)՝:I~+2װ%sB! Vp-SUnbR, N*qT@5O'r15GSm۷&fgWh2'O5Z,4dûe#$:F5ѿOfZW֩ Wc/BAdHd{mWkt$&Q`M?)("9 Hk+@N^$ O p)3wmW{6euR[VlG&mܞ„o t<=c^AjBV֚cS]֬G+\F6`f`x#4I#={owE9']JVHiҪ)Aعܒo,0mշ/Yչ?ȆVMc[kIF@zU+3n3Cu2ϙQ/3PKzp7P=1o3)RX/]{5: na5ɩtdF4o}tq,|oUD¬U 39F%Þ\Hj梟`0קF#l`ٔf Y3.n`ϬGjar/;U{a߉vcF0ǒQZi☉hci2Za k²b@'^bĵdxEʳgr=ty<Ɣ0R&0&I2Pp!)APF TqOIK R7.IG&_`-x:^>'k,M6 +43qbR*Y$'1H$8 1*qӘ(IS$MA…a'ōüx $P$` D7HTa18dq@x S,ԖS%T]D0rC$>H`M'bzZ-qziw~ pz61̦ ~] աe@!3_q>ۤ^JbJ9?xY >:Ig~Wڍ#׌]6SYL)/Z;g 笛TF:ٻRl0[}bi|h&KZ_ޙrwSO6#Kg9#m;ȕewY"ˍ\!&2[ Ew(ށ]|prRt֎thFnUEeĖz'Antb!fK8%NnjJyꛚV/Ք8Z5 $QWv 9Z5N`!,^iR.uZ{aL '[Z 2sD;Iᒠң-z$E;t \U^uڡoP,:FYL-iݺq"ϋV׵Ch%iqƩn)D 8ErϷnb\sJ݌ oCe' *jiVY 1]tr{ ~naK>nNUF5-ۋz[x(f"^`P; w^5)QNxIlq{>!!ήWh ShUp3~lU&dw!dhmHPB}8#MF/9-9" oW"|)ǭ89YFIs+ރ+ώ|@s7K2kuDT99Fe ◑Xr!' 
y|v@b BძONhz<-n8?ҳ A \g9Oci}zI^ J`ЖD۽r}v?C~ 'Fryşi*njv0'&vG4D{!AɛtD!f z0뇥kjrWl`re i{V厠-[9o&H[ I g:D5GϥH'aC]x[M o_itT@~\KN`Uؓ#3̯D'*EH}$™!䕡>PoAcHQBG;*3#'fA]1k%ȒY?TG8/P}T^[1Y37*z١Д>Y0ү3"}ҙfhLEF0@ #s XF41۬HԽ#Η F JA'au'ĔBAرO̒R0aFu>- Ct]nu.4\w-;G# ̽.*V3Q}- Wb9$z=04/PL\ Fq]2bNZIeɈț̴ADaJ҈PA C=^1)M8CƷڍޒkX2udr M_O2)4mMA^UV?!ǧCIDw > KIױvu[/ dMCϪwLefr_94bxQr=:MmY# PAjGk^?{(Dj?@F!P/f .cPds6m4z`h6ݢ֕`o]j4버J*p4 cT0X*~Yڰ暹|^lQ8 /$ dEYk7mSvrsJŎrz ڽ7!w͍F% /b葉.s%- SK_ᛈٔt?h4^@#QաMF=AW5mRuV%G3s]p4wzt✑gPL!4~C'br-.3b#!ך*uIF;ics+Sw U ZjeҦt+;̭,5toț팠(@Oeu.K˽y`S(z\`@BVN;0HzBUwQ)v–c 2m8K/} #$+8͠h`5^ n+$qÑDHL!QkEmxU9ZV6 I(=Fk4gd:(QRbA J۶3ǑZاG9WO#FىZOK1y7]ef Xl_!D (fpx!x4w|vEY#Y7M&pn4Yo8XO#73~';0B47s-QڶꝶϿF䜱윥0YG;t45:7D0J#CIࢯW&WIg\A('>; /Y#LwtVpN8 N~x6$`Ļ%85ղM, ?/0z&L2.Vh~|'YNF7}WORjr.)^%mw\u!'>˶ F;7eLvXF<XHXHFa0}^'KC"]S4QRsx 뤜ƫ"nm@(Qj4<,/BXyQ_f?5I??t;fRi2OǧǛ?LL/|V/qs;p#3,`?`] 8doN1r~2x1 YU/ z-O¼x ި8kԼkF@9[ fM`?|Zi L郚.\0z3eݗ}Vxz$筣rNѬou.DQ4׵ 38ZH6J苋2H;k!^XNb$1IaOEb`̩KMDL0=-tXoJhE3fZKQhNEa; RZI=#Adup9J`9OrDy%WkHEbTZBcl9~oom&:@ "D<Z"-È; %>qN1e)JV8NĪWDGʳ] qd* IQ`yU\7?SC`-LRWWm*YV%‚ xa4Hz!k ̞hCE5CP<:DV{0?&/&y&y?Zd(!J4{$ {b ))dTsgaEǜ  ͫ5[wb UwqdCmñfU`A1`c9)9K9k$8VJhgîv;M#F1mcI. 3<.Ἡn+VCD~|]J|1ַLy¿ ?}(>QnnW塯hq? 
gcYS '`rd=wjMćeD|y4ұh((Xn.y`&p+}Ow :y[w >\=コpl{74#Z/יisS@*kŝ4?3\.rE6{]e [}ՍioMƬb$ӀQsݘ^LX0@'/ҙ fwܵEw} пV`ս2TwYU!Fh>H{=|L bthA{)A|^jƕRg 6#qbyжkOk9:EɸU1o8rQEײp*MNTCVᢴ@&.vI^EV J@ΕX Vq#'RH$Šx̸uN7Y } 4ǽ_|y?R')ʧ,Q,ϝEC31dLR3pֺD׏h)noluW<.*ecFKwp1 5C_POKao{} 4l3m @2qᘗ *gbP'̝/f">K]#'O?M.rĊ3q?f%Eu|߾6dEue&tƣ!1鸭:&5!;Tho_'x<,Fc|p\ VIx?/ꉣ~lX}f'̞͕ px خpb{LٞgY/`[w_LxOS%ϟ)9e:%bn-n^}SC3[V\U=V> #2_ TgKLg7ejB)I/hr \VxLO_iN0"8C"'$Zыܥn/Y';i#9K ֽi|~<=qUtYuy8=&kpJLR - A 2Q_kَ4zwV\F%'/ ")4^N(ة̨W {eG %bNTFSos>0T;%ب<&&A;85׼ᨎ,`k-_Q+ oQ5e3) wc RY)%^5F(&k.&DCql5.VdI?_.Rw.Uu֮r]x]`Lc*/?+p:]4I _B^`#Yo@VXN-FmvoӾ&]a5I{ 7ڋ -lI\½+O I{l*M Pha]{=Z$Zkt"^RpZsդ2v Z  q0Ak={dh]M8-l܉ aR:,<Og_aXy!e*%#$jIw;ŷvf~I6?C٠NW8m4qeeLPrDG[zZBiwNv1>75(:P"N6&uA(r#)sd+Y{#[_$yO֯F!7L\-|, L\y >(q\^_@s8%VwDa]ϵFdF;/oZ;JZTy*11-NNzYy^I Jr 5IGRU"ۡ3=֗.[!@3r 3Kmt( N,w)e|HBWZy7i {#i;3;ÈG6hH8浍vÈH@%1dcL*GL9'u}E ͠f.*Uu|5@hQg4wQdCsuvۨc0^2$h [QEP;`:QZ]"Җ)"qQ^`'CM)TÝR|5 ZKϸD.!8P:3< CcCI'|K Bm9Qg|''LaG |'[P')}. uЇŮ1lVC߆@$|| Fa#_-A5>h b_71z܍f..?.V5."s wN%_OX·1Nw`_AZ YFݏn->t TjqxQt!yϬ23ozV{&RNd˃ H<Ԩ=xRgrkk@ Lpdf]]ᤣSYżoq=x"-cHM( O5-{dnGg'I޶\֛c*٩b ju{Nx9}(%z ĩɒுԔI5j'!v"ÚYYCS592wt&eKݑkX3-L’ƹ.^?tK5'I2t;&gK#2,׭A,qJR'zRC ӵV)5D-ruԓC"?Svc,1t=Es0vN]quS1(\0ז/e"כ/Dӊ{ߕ]ڽZyf4^c; U-uU} n`9%>1ȾmQzؠV9 as}`giQA2iphwWiN cGYlJdUUUr3e5B}~\r~|ͦˢ]nSɠnI|e~5X,9XGA!#]fޮ]cbjW)gRFV l"TYal ( G My"(6h^fj :F ,O6JR/ N]| oTtM=KT!%Vs3X;42tRyC4xm3o%.ދO{oƋ EitX /WCz" )][o#7+^ͮe/سAp2;q&O90&Ȓ$O28~nYn]-J' ##dx qS$,-b)$˹'LG;2JʲsAtjbKQc OcYk`RהR^?\ T` irGB4"*Z]>JQqϗ >\HVyͻ_ykN QS9I*dv__% d/AYxOڮ_RK;23#EIF|TkB:7խ5Zs)m0LE;."[U)pTۻ ^ٛb^pzR-s/9tKtUG?B'NV*WnwKQږT'n[J"}Ʊl H썍<}u: V@ !c] NDUvtv]! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.963634962 +0000 UTC m=+6.083267235,LastTimestamp:2026-03-08 00:05:51.963634962 +0000 UTC m=+6.083267235,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.611410 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f1570413e6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.963714534 +0000 UTC m=+6.083346797,LastTimestamp:2026-03-08 00:05:51.963714534 +0000 UTC m=+6.083346797,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc 
kubenswrapper[4713]: E0308 00:06:50.618094 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fb94478 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 00:06:50 crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:50 crc kubenswrapper[4713]: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,LastTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.622352 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fbadd6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,LastTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.626666 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f33fb94478\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fb94478 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 00:06:50 crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:50 crc kubenswrapper[4713]: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,LastTimestamp:2026-03-08 00:06:00.170192917 +0000 UTC m=+14.289825150,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.630960 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f33fbadd6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fbadd6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,LastTimestamp:2026-03-08 00:06:00.170236628 +0000 UTC m=+14.289868861,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.634855 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f0badca476\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0badca476 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.343884406 +0000 UTC m=+3.463516659,LastTimestamp:2026-03-08 00:06:00.632333569 +0000 UTC m=+14.751965802,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.638578 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f0c4f404b1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c4f404b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.513188529 +0000 UTC m=+3.632820762,LastTimestamp:2026-03-08 00:06:00.850594661 +0000 UTC m=+14.970226894,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.644067 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f0c57903e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c57903e1 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.521904609 +0000 UTC m=+3.641536842,LastTimestamp:2026-03-08 00:06:00.862786469 +0000 UTC m=+14.982418702,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.650057 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.654179 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.660039 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:11.963604876 +0000 UTC m=+26.083237119,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.665314 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab193737\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:11.963655787 
+0000 UTC m=+26.083288030,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.670164 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f5ff4971dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:11.966702044 +0000 UTC m=+26.086334377,LastTimestamp:2026-03-08 00:06:11.966702044 +0000 UTC m=+26.086334377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.683422 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f054ee2795\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f054ee2795 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.633756053 +0000 UTC m=+1.753388316,LastTimestamp:2026-03-08 00:06:12.084455383 +0000 UTC m=+26.204087626,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.688537 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f0695e87dd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f0695e87dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.976665053 +0000 UTC m=+2.096297326,LastTimestamp:2026-03-08 00:06:12.22150699 +0000 UTC m=+26.341139213,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.693848 4713 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ab4f06a985b21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f06a985b21 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.997231905 +0000 UTC m=+2.116864148,LastTimestamp:2026-03-08 00:06:12.239866785 +0000 UTC m=+26.359499058,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.700436 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:21.964333619 +0000 UTC m=+36.083965872,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.705866 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab193737\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:21.96438592 +0000 UTC m=+36.084018163,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.711131 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:31.964216751 +0000 UTC m=+46.083849014,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.481552 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.964359 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.964408 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.482349 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.540753 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.543227 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.802615 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.804723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707"} Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.805010 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.481770 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.552959 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.809378 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.810133 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812538 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" exitCode=255 Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812580 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707"} Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812615 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812635 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813568 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.814130 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:53 crc kubenswrapper[4713]: E0308 00:06:53.814302 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.290096 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.481309 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.816003 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818094 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.819072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.819714 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:54 crc kubenswrapper[4713]: E0308 00:06:54.820047 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:55 crc 
kubenswrapper[4713]: I0308 00:06:55.480427 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.606160 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.615311 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616526 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.621920 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.819620 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820719 4713 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.821325 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.821511 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.431761 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.447464 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.483587 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:56 crc kubenswrapper[4713]: E0308 00:06:56.619918 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:57 crc kubenswrapper[4713]: I0308 00:06:57.481127 4713 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.482701 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.969531 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.969702 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.975590 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.480749 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.829439 4713 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:00 crc kubenswrapper[4713]: I0308 00:07:00.482330 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.327612 4713 csr.go:261] certificate signing request csr-bj8qx is approved, waiting to be issued Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.334665 4713 csr.go:257] certificate signing request csr-bj8qx is issued Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.388837 4713 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.314808 4713 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.336222 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 22:24:43.889191823 +0000 UTC Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.336379 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6214h17m41.552820944s for next certificate rotation Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.622685 
4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623860 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.632412 4713 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.632847 4713 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.632880 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636375 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636421 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636461 4713 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.647941 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654922 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.663361 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669987 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.682043 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689400 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689438 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699709 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699841 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699866 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.800264 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.901377 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.002239 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.102973 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.203722 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.304444 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.405371 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.505504 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.605872 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.706623 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.807259 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.907654 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.008717 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.109461 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.210132 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.310999 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.411966 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.513076 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.614125 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.714990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc 
kubenswrapper[4713]: E0308 00:07:04.815554 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.916594 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.017495 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.118011 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.218910 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.319962 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.421110 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.521725 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.622406 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.723100 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.823478 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.923858 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.024920 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.125706 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.225893 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.326197 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.426536 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.527090 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.620456 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.627315 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.728036 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.828856 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.929575 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.030157 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.130262 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.231442 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.332444 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.432571 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.533461 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.634496 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.735241 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.836100 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.937277 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.038388 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.139438 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc 
kubenswrapper[4713]: E0308 00:07:08.239632 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.340859 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.441004 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.541952 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.642934 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.743928 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.845086 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.945700 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.046656 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.147346 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.248256 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.349435 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.450655 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.540743 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.541960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542033 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542999 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.543279 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.550974 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.651795 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.751933 4713 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.853009 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.953752 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.053925 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.154749 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.255919 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.356845 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: I0308 00:07:10.408666 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.457912 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.558912 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.659990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.760446 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.860933 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.961103 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.062307 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.163417 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.264392 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.364790 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.464971 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.565513 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.665623 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.765989 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.867067 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.967621 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc 
kubenswrapper[4713]: E0308 00:07:12.068428 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.169236 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.270232 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.370450 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.471479 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.572547 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.672997 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.773601 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.874144 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.975060 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.993610 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997740 4713 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997860 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997905 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:12Z","lastTransitionTime":"2026-03-08T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.009568 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014817 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.031696 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.037007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.037059 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.055348 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060795 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060872 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077081 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077304 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077343 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.177982 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.278901 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.379757 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.480531 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.540484 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.542079 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 
00:07:13.542112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.542124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.581573 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.682674 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.782873 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.883025 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.984061 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.084888 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.185998 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.287056 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.387343 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.487501 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.588281 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.688605 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.788714 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.889729 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.990752 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.091416 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.192135 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.293222 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.394483 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.495163 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.596050 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.696463 4713 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.797223 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.898333 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.998655 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: I0308 00:07:16.009437 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.099373 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.200234 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.301099 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.401968 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.502900 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.603907 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.621180 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.704438 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.805847 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.906673 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.007162 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.107522 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.208481 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.309611 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.409718 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.510945 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.611914 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.712624 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.813403 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc 
kubenswrapper[4713]: E0308 00:07:17.914372 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.015317 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.116132 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.216933 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.317096 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.417706 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.518877 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.619789 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.719932 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.820195 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.920716 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.021593 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.122675 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.223711 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.305124 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326089 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.428844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429572 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.506869 4713 apiserver.go:52] "Watching apiserver" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.517604 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518008 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518637 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518897 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.519328 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.519217 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.520017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.520605 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521242 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521540 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521583 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.521794 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.523759 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.523947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524144 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524275 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524741 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.526118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.531809 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc 
kubenswrapper[4713]: I0308 00:07:19.532887 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532908 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.543928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.555243 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.569617 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.580016 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.583758 4713 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.589817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.599309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.611965 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612113 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612507 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612714 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.614895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.615741 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616013 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616417 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613795 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617343 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.614815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616274 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616662 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617141 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617807 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618043 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618362 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618657 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618802 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618716 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619857 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619892 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620027 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620136 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620185 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620207 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620231 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620257 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620281 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620307 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620326 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620345 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 
00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620363 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620383 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620604 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620624 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620644 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620664 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620686 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620706 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: 
I0308 00:07:19.620746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620766 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620796 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620848 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620871 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620891 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620913 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620935 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621018 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621077 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621099 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621202 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621228 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621249 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621269 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621315 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621355 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621394 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621843 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621919 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621958 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622525 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622562 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622652 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.622666 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.122643693 +0000 UTC m=+94.242276016 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622698 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622801 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623014 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623108 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623890 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623899 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624177 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624269 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624418 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624548 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624560 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624621 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624648 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624774 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624863 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625226 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625241 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625518 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625637 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625656 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625704 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625372 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625526 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626278 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626458 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626489 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626531 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626570 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626591 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626611 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626653 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626675 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626695 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626717 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 
00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626737 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626759 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626781 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626810 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626864 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626888 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626908 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626930 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626950 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626995 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627019 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627079 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627101 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627144 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627192 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627212 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627233 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627254 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626023 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626092 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626494 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626644 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626864 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627256 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627270 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627289 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627291 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627584 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627609 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627618 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627634 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627663 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627689 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627713 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627735 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627760 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627782 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627809 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627851 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627971 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628016 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628042 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628114 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628138 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628161 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628180 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628221 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628265 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628307 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628327 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628348 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628370 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628431 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628519 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628541 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628565 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628610 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628715 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628743 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628788 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628807 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628877 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628901 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629019 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629085 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629106 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629128 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629153 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629177 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629230 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629255 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629301 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629359 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629385 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629434 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629456 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629479 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629503 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629671 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629700 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629804 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629912 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630882 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630899 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630913 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630926 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630940 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630954 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630968 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630981 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630994 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631007 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631019 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631031 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631043 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631055 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631123 4713 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631137 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631150 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631161 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631175 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631187 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631199 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631210 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631222 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631234 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631244 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631763 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631778 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631791 4713 reconciler_common.go:293]
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631803 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631815 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631846 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631859 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631870 4713 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631883 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631904 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631916 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631928 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631941 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631954 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631966 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631981 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631993 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632005 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632017 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632029 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632042 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632053 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632065 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632077 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632088 4713 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632100 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632112 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632122 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632134 4713 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632146 4713 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632158 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632170 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632183 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632196 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632208 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632336 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632352 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632365 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632378 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 
00:07:19.632391 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632402 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632413 4713 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632430 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627392 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628020 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628070 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628207 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628313 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628322 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628471 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628533 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633348 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628855 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629337 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629353 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629353 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629483 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629671 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630051 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630063 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630178 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630249 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630397 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630574 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630935 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630968 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631037 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631239 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631388 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631650 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631733 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631887 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632661 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632745 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632773 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632779 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633168 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633407 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633659 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633910 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634428 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634482 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634506 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634589 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634809 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634691 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635293 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635348 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635359 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635504 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635984 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636172 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636311 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636903 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637196 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637520 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637529 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637806 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638017 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638015 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638513 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638547 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.638804 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.638909 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.138894264 +0000 UTC m=+94.258526497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.639279 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.639315 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:20.139305584 +0000 UTC m=+94.258937917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641172 4713 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.642524 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.649056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650379 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650414 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650923 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.651195 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652585 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652624 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652638 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652585 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652693 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652705 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652758 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.152739494 +0000 UTC m=+94.272371827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652791 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.152782365 +0000 UTC m=+94.272414718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.657597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.658031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.659757 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.663211 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.663380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.664499 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.665369 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.666175 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.666208 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.654674 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.667084 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.668038 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.669752 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.669943 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670014 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670324 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670275 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.672934 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.674707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.675458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.675945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676174 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676287 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676474 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676859 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.678973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.679598 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.680913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681507 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681585 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681982 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.692000 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.718387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733432 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733473 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733527 4713 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733541 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733550 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733558 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733567 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733575 4713 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733584 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733593 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733650 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733660 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733670 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733682 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733693 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733704 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733716 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733729 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733739 4713 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733767 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733776 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733785 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733792 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733800 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733808 4713 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733815 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733840 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733849 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733857 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733865 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733872 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733881 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733890 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733898 4713 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733906 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733914 4713 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733923 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733932 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733939 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733948 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733957 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733965 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733973 4713 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733981 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733990 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733998 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734006 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734015 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734023 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734031 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734040 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734049 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734059 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734069 4713 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734077 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734084 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734093 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734101 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734109 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734117 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734125 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734133 4713 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\"
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734141 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734150 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734158 4713 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734166 4713 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734174 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734182 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734190 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734198 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734206 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734213 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734221 4713 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734230 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734238 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734246 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734254 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734262 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734270 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734278 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734286 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734294 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734301 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734310 4713 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734318 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734326 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734334 4713 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734344 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734352 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734360 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734368 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734376 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734383 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734391 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734399 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734409 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734417 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734425 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734432 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734440 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734450 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734457 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734465 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734472 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734480 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734488 4713 reconciler_common.go:293] "Volume detached for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734496 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734503 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734511 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734519 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734526 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734534 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734543 4713 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734550 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734558 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734565 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734573 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734580 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734588 4713 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734600 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 
00:07:19.734608 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734616 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734624 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734632 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734640 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734647 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734655 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737886 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737946 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.836817 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839877 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839949 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.843964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.850615 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.866311 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: source /etc/kubernetes/apiserver-url.env Mar 08 00:07:19 crc kubenswrapper[4713]: else Mar 08 00:07:19 crc kubenswrapper[4713]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 00:07:19 crc kubenswrapper[4713]: exit 1 Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 00:07:19 crc kubenswrapper[4713]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: W0308 00:07:19.867281 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd WatchSource:0}: Error finding container df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd: Status 404 returned error can't find the container with id df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.868322 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.871396 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 00:07:19 crc kubenswrapper[4713]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 00:07:19 crc kubenswrapper[4713]: ho_enable="--enable-hybrid-overlay" Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 00:07:19 crc kubenswrapper[4713]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 00:07:19 crc kubenswrapper[4713]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-host=127.0.0.1 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-port=9743 \ Mar 08 00:07:19 crc kubenswrapper[4713]: ${ho_enable} \ Mar 08 00:07:19 crc kubenswrapper[4713]: --enable-interconnect \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-approver \ Mar 08 00:07:19 crc kubenswrapper[4713]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --wait-for-kubernetes-api=200s \ Mar 08 00:07:19 crc kubenswrapper[4713]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: W0308 00:07:19.874049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4 WatchSource:0}: Error finding container 
abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4: Status 404 returned error can't find the container with id abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4 Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.874552 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-webhook \ Mar 08 00:07:19 crc kubenswrapper[4713]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.876731 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.878641 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePo
licy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.879712 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.880251 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"401214c15d0ba80cdf8afdf54687a96d22ba11f0fa3c96749c400fe814f51eb0"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.881764 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: source /etc/kubernetes/apiserver-url.env Mar 08 00:07:19 crc kubenswrapper[4713]: else Mar 08 00:07:19 crc kubenswrapper[4713]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 00:07:19 crc kubenswrapper[4713]: exit 1 Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 00:07:19 crc kubenswrapper[4713]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.883083 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.883321 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.884202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.885601 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 00:07:19 crc kubenswrapper[4713]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 00:07:19 crc kubenswrapper[4713]: ho_enable="--enable-hybrid-overlay" Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 00:07:19 crc kubenswrapper[4713]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 00:07:19 crc kubenswrapper[4713]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-host=127.0.0.1 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-port=9743 \ Mar 08 00:07:19 crc kubenswrapper[4713]: ${ho_enable} \ Mar 08 00:07:19 crc kubenswrapper[4713]: --enable-interconnect \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-approver \ Mar 08 00:07:19 crc kubenswrapper[4713]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --wait-for-kubernetes-api=200s \ Mar 08 00:07:19 crc kubenswrapper[4713]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.891131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.891882 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 00:07:19 crc kubenswrapper[4713]: exec 
/usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-webhook \ Mar 08 00:07:19 crc kubenswrapper[4713]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least 
once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.892381 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice
{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.893260 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.893593 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.902114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.913459 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.923114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.933066 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942107 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942448 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.952232 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.961855 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.972927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.982995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.997682 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.006805 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 
00:07:20.043865 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043889 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.138371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.138561 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.138543413 +0000 UTC m=+95.258175656 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147241 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239888 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.239999 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240033 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240046 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240074 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240099 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240080003 +0000 UTC m=+95.359712236 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240096 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240150 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240171 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240138755 +0000 UTC m=+95.359771028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240177 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240202 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240213 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240192116 +0000 UTC m=+95.359824509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240252 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240234447 +0000 UTC m=+95.359866680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250460 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250537 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354152 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354166 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456961 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.540658 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.540980 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.545325 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.546140 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.547129 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.547801 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.549391 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.549944 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.550651 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.551761 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.552578 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.553790 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.554644 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.555951 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.556697 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.557495 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.558258 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.559069 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.559941 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560144 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560547 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.561367 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.562171 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.562804 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.566155 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.566792 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.568356 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.569035 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.570604 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.571541 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.572773 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.573599 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.574841 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.575523 4713 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.575669 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.578106 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.579527 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.580193 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.582612 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.584342 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.585113 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.586513 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.587862 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.589114 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.589907 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.591192 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.592049 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.593135 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.593863 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.594990 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.595965 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.597243 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.597944 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.599032 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.599704 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.600480 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.601533 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664359 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664413 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664435 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.871952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.871991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872023 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975528 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975653 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078405 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078511 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.147204 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.147425 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.147379412 +0000 UTC m=+97.267011675 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181781 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181871 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248579 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248711 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248783 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248799 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248810 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248723 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248896 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248855831 +0000 UTC m=+97.368488104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248890 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248938 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248918492 +0000 UTC m=+97.368550845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248958 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248974 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248956553 +0000 UTC m=+97.368588896 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248985 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.249107 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.249073906 +0000 UTC m=+97.368706179 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285445 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388345 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491163 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491207 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.539970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.540005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.540242 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.540390 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.553866 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.554317 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.554627 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594307 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594387 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594399 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697135 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799856 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799870 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799880 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.890131 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.890372 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901551 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004918 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004978 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108665 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108763 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108781 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211240 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314151 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417146 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417239 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417258 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519859 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519886 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.540171 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:22 crc kubenswrapper[4713]: E0308 00:07:22.540293 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622543 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622689 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729316 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831507 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831568 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934549 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036448 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036516 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138718 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.164141 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.164346 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:27.16431716 +0000 UTC m=+101.283949383 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240759 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265694 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265727 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265739 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265765 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265765 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265805 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265784899 +0000 UTC m=+101.385417132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265814 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265848 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265864 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265942 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265896382 +0000 UTC m=+101.385528735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265992 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265970214 +0000 UTC m=+101.385602487 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.266037 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.266022865 +0000 UTC m=+101.385655358 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.372023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.372032 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.388176 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394601 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394662 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.410540 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415318 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415408 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415456 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.431092 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436262 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.450868 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455317 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.471727 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.471979 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474435 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.539983 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.540011 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.540157 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.540287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576916 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679472 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.782003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.782013 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.884713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885689 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988603 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988612 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.041198 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.091017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.091040 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.194639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195477 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.298913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299819 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.403009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.403031 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505623 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505738 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.540841 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:24 crc kubenswrapper[4713]: E0308 00:07:24.541016 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609113 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712633 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712746 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.815493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.815886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816488 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919228 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.023118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.023383 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.128223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.128395 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.232392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.232780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233977 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.337749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338990 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442134 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.539901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:25 crc kubenswrapper[4713]: E0308 00:07:25.540073 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.539900 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:25 crc kubenswrapper[4713]: E0308 00:07:25.540806 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545433 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545497 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648310 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751278 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751320 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854229 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957261 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957302 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957341 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060311 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060355 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163266 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.265979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266039 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266103 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369645 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369933 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369962 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369987 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.539986 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:26 crc kubenswrapper[4713]: E0308 00:07:26.540572 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.557520 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.573086 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.576963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577040 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577065 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577085 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.591572 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.609152 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.627312 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.645819 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.661568 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679296 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782506 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885665 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885800 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988970 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988988 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.989012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.989029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091607 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091635 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194626 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.201923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.202093 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:35.202059706 +0000 UTC m=+109.321691989 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297385 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297447 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297476 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302844 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302882 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.302915 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.302974 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.30295601 +0000 UTC m=+109.422588253 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303025 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303049 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303058 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303091 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303068123 +0000 UTC m=+109.422700386 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303067 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303117 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303176 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303155455 +0000 UTC m=+109.422787718 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303096 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303207 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303245 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303233307 +0000 UTC m=+109.422865570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400469 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400493 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.504613 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505638 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.540046 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.540237 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.541062 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.541256 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.608614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.608998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609938 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713596 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713672 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.817449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818774 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921587 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023910 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229037 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229046 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229068 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332175 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332188 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.434969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435172 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.537934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.537987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538041 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.540554 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:28 crc kubenswrapper[4713]: E0308 00:07:28.540761 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640535 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743519 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743586 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846967 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949869 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053307 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156666 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156730 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259413 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361910 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.362021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.362039 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465637 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465697 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.540107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:29 crc kubenswrapper[4713]: E0308 00:07:29.540287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.540775 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:29 crc kubenswrapper[4713]: E0308 00:07:29.540955 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568711 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568752 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671498 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671873 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775279 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878528 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878650 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.982010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.982155 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085931 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189586 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189650 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292448 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292507 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395506 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395526 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501260 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.541348 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:30 crc kubenswrapper[4713]: E0308 00:07:30.541775 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.604871 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.604977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605087 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.709066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.709171 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812461 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812687 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.914661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.916067 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.019724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020272 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020548 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123880 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329361 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329422 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432413 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432596 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432623 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.539976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.540022 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:31 crc kubenswrapper[4713]: E0308 00:07:31.540235 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:31 crc kubenswrapper[4713]: E0308 00:07:31.540574 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542278 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.646018 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749181 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853284 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957528 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.062040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.062267 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165213 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269250 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372423 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372452 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372474 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.540515 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:32 crc kubenswrapper[4713]: E0308 00:07:32.540710 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578246 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578269 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578320 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.682015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.682036 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784785 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.886979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887057 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887070 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.925334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.925388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.941148 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.955607 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.970898 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.987928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989943 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.002257 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.011669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.021938 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.091989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092051 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092082 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194725 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296550 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296605 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.398995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399046 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399071 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502108 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.540603 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.540751 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.541249 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.541363 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.541793 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.541976 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc 
kubenswrapper[4713]: I0308 00:07:33.604928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604945 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708326 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708436 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.771542 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fp2h2"] Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.771912 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.774928 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.774965 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.775067 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.796140 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811729 4713 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.812203 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.830900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.845455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.865587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.866896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.866968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: 
\"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871227 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.880922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.888162 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893166 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.897444 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.911972 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.912646 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915751 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.929873 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531"} Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.931917 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936989 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.947893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.953226 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.956979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957033 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957049 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957061 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.966656 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968657 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: 
\"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.973173 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.973299 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974677 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.983630 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.988717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.001200 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.017428 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.033296 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.045705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.058984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076687 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076794 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076816 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.093114 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.109773 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34185fa0_b348_45e6_990e_4bb01410d564.slice/crio-97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49 WatchSource:0}: Error finding container 97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49: Status 404 returned error can't find the container with id 97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49 Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.153378 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4kr8v"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.156580 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fh96f"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157632 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-54zzt"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157869 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157768 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160351 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160439 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162388 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162698 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163184 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163293 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163561 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.164304 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.174734 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.176753 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180139 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180188 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.185765 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.198003 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.210986 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.258208 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271363 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271450 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") 
pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271536 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271556 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271569 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod 
\"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271582 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271595 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271623 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271659 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271679 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271701 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271751 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271781 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271796 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271838 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271857 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271876 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271914 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271967 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.272292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.285075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285084 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.297375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.312618 4713 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.326930 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.341474 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.354868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.367135 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372588 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372605 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372624 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372640 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod 
\"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372705 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372737 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372777 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372808 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372812 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.372886 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372909 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372936 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372961 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372980 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372998 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373006 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373053 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373068 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373771 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373835 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373149 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod \"multus-fh96f\" (UID: 
\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373070 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373741 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373963 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372754 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.374327 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.375179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.377498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.378209 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod 
\"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.380188 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387336 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387352 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.388107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.393947 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.396405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.397150 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.406610 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.418064 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.429100 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.440692 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.450182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.461410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.476252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.487033 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490037 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490086 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490435 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.499581 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.507387 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.518985 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf95e3f7_808b_434f_8fd4_c7e7365a1561.slice/crio-d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542 WatchSource:0}: Error finding container d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542: Status 404 returned error can't find the container with id d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542 Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.532091 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7dbbe8c_4ae1_4a6b_9b62_eac6a5c73205.slice/crio-93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1 WatchSource:0}: Error finding container 93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1: Status 404 returned error can't find the container with id 93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1 Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.541169 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:34 crc kubenswrapper[4713]: E0308 00:07:34.541298 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.559551 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.560332 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564282 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564600 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565192 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565535 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565578 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565611 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.576570 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.589096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.592522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592536 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.600960 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.611547 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.628089 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.642224 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.655130 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.668900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676113 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.676130 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676163 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676186 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676201 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.676216 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676284 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676402 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676542 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676577 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676615 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676651 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676710 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676754 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676809 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.683777 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.694572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694607 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.696657 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.714205 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.743608 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd 
nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777416 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777488 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777553 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") 
pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777603 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777689 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777704 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: 
\"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777723 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777716 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777761 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777793 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.777902 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777942 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777952 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777993 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777996 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod 
\"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778044 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778114 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778333 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778702 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778193 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778997 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.779039 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.783652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796675 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796686 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796711 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.800108 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899128 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899138 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.933714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.933764 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.935776 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fp2h2" event={"ID":"34185fa0-b348-45e6-990e-4bb01410d564","Type":"ContainerStarted","Data":"edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.935861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fp2h2" event={"ID":"34185fa0-b348-45e6-990e-4bb01410d564","Type":"ContainerStarted","Data":"97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937623 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937669 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c3ada8a6a2b79759353dfd8087cd376ccb54b5781a552e2c181132bd8987a990"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.939059 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.939087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.952957 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.965319 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.965519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.977965 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.979705 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d WatchSource:0}: Error finding container 6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d: Status 404 returned error can't find the container with id 6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 
00:07:35.001733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001775 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001813 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.005392 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.029778 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.043409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.055147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.066900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.079281 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.094869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103977 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.118636 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.134050 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.153662 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.174664 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.191434 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc 
kubenswrapper[4713]: I0308 00:07:35.206656 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206710 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.207508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.221337 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.234921 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.248688 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.263607 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.279978 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.282525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.282739 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:51.282691819 +0000 UTC m=+125.402324052 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.292648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.305687 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309527 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309573 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.318648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383569 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383692 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383753 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383777 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383816 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383850 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383815 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383919 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:51.383900541 +0000 UTC m=+125.503532774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383923 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383969 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.383959293 +0000 UTC m=+125.503591526 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383924 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384078 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384076 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384128 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384193 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.384154777 +0000 UTC m=+125.503787160 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.384211109 +0000 UTC m=+125.503843582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412599 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515935 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.540317 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.540333 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.540633 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.540779 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620333 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.723004 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825549 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825587 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929338 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942139 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" exitCode=0 Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942204 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942229 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.943679 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425" exitCode=0 Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.944156 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.962263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.987103 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.999258 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.009626 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.022445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.048974 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060385 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.086850 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.103395 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.115202 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.123340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.133272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.148454 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.158979 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162162 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.171085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.184724 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.199068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.213967 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.226082 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.242973 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.255251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.266867 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.276860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.290839 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.308044 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367112 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469061 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.540731 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:36 crc kubenswrapper[4713]: E0308 00:07:36.541046 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.562785 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572813 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572885 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.583385 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.603587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.615555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.635762 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.650934 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.664229 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674789 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.677955 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.692870 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.704492 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.716444 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.727431 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.776891 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc 
kubenswrapper[4713]: I0308 00:07:36.777130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777138 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.950938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951625 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951973 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} Mar 08 00:07:36 crc kubenswrapper[4713]: 
I0308 00:07:36.952901 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79" exitCode=0 Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.952949 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.974860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982107 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982122 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.989299 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.998791 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.012027 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.027702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.041175 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.053128 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.066854 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.078375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc 
kubenswrapper[4713]: I0308 00:07:37.084742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084753 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.091773 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.137198 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.164189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc 
kubenswrapper[4713]: I0308 00:07:37.187256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295402 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295414 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399872 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399921 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.504994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505099 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.540645 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.540728 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:37 crc kubenswrapper[4713]: E0308 00:07:37.540819 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:37 crc kubenswrapper[4713]: E0308 00:07:37.541025 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608271 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608405 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608428 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710982 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812794 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915243 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915279 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915314 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.957244 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2" exitCode=0 Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.957289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.979928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.004896 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017660 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.024366 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:
07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.036435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.050014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.060039 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.072906 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 
00:07:38.085147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 
00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.096014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.106671 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.120561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 
00:07:38.122126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122176 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.132746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228194 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc 
kubenswrapper[4713]: I0308 00:07:38.228254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228265 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330048 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330088 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433135 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433171 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536664 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536721 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536804 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.540121 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:38 crc kubenswrapper[4713]: E0308 00:07:38.540284 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640401 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640494 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743645 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846135 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846181 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.949017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.949031 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.964295 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625" exitCode=0 Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.964398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.966018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.972935 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.976518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.994453 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.010317 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.022578 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.038559 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.057343 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058691 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.069735 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.081468 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.095394 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.108899 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.125491 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.137704 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.148612 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.156971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161200 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.171894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.182676 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.194373 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.204647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.215144 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.225879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.239292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.254519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262997 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.263015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.263025 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.267505 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.278099 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00
:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365947 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468652 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468664 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.540717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.540723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:39 crc kubenswrapper[4713]: E0308 00:07:39.540876 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:39 crc kubenswrapper[4713]: E0308 00:07:39.540955 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.554182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570786 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673189 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673223 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774793 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877321 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.977577 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee" exitCode=0 Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.977659 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978757 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978769 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.000481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.016770 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.030422 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.042369 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61d
eb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.052971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.066613 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.082986 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc 
kubenswrapper[4713]: I0308 00:07:40.083061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083073 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.087298 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.100219 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.116777 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.130042 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.142697 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.160959 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.176096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186135 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391235 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391249 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391260 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493922 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.540988 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:40 crc kubenswrapper[4713]: E0308 00:07:40.541127 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.597076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.597152 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699644 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802183 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.827976 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-d9bpk"] Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.828549 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837456 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837521 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837645 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.838165 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845092 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845144 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc 
kubenswrapper[4713]: I0308 00:07:40.853409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.869666 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.882318 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.891605 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904029 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904440 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904473 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.915480 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.926067 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.937199 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.945897 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.945949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " 
pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946191 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.949498 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574
53265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.962575 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.969456 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.976309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.984384 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208" exitCode=0 Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.984456 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" 
event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.989344 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008373 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.009514 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.022993 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.037383 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.049188 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.059503 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.074502 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.087932 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.115969 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116258 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116368 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.144811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.148274 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.164075 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.175752 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.188551 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.200733 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.213166 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219219 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.231894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.243405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320760 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422647 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422672 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524430 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.540686 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.540735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:41 crc kubenswrapper[4713]: E0308 00:07:41.540808 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:41 crc kubenswrapper[4713]: E0308 00:07:41.540999 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728805 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.831916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.831993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832083 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934049 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934122 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.991846 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.994435 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9bpk" event={"ID":"23406c9e-4ba0-4b59-a360-fb325a1adb0b","Type":"ContainerStarted","Data":"0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.994507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9bpk" event={"ID":"23406c9e-4ba0-4b59-a360-fb325a1adb0b","Type":"ContainerStarted","Data":"d50b9cebf0a75336d3c988668e019e91bfc640c20e72aed0928a601696b242cd"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000306 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000732 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000787 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000805 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.012671 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.029107 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.030255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037314 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037355 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037372 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.042531 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.055758 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.069345 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.083271 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.093269 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.123997 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.136414 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139501 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.151375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.164879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.182227 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.204259 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.219102 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.235446 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242124 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.260430 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.272329 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.286811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9f
aa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c395
8568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:4
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.300371 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.319186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.331991 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345532 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345550 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.352323 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.368578 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.386090 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.399240 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.422650 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.445169 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: 
I0308 00:07:42.448131 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.459123 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.540675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:42 crc kubenswrapper[4713]: E0308 00:07:42.540842 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.549977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550051 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550078 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652652 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652744 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755251 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755289 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857701 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960319 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960359 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960388 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064541 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167809 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270589 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372233 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372263 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474154 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.540536 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.540621 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:43 crc kubenswrapper[4713]: E0308 00:07:43.540647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:43 crc kubenswrapper[4713]: E0308 00:07:43.540729 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576468 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576548 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678690 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781136 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884153 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985343 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985363 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985402 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.005987 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.008815 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011306 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.014408 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" exitCode=1 Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.014486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.015699 4713 scope.go:117] "RemoveContainer" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.029265 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.033992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034101 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.045501 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.057313 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061740 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061783 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.065649 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 
00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.075431 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081252 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081232 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z 
is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.096534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.098810 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.099721 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102173 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102219 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.109680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.122386 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.145292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.159377 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: 
I0308 00:07:44.172209 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.195497 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205137 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.211031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.224691 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.239027 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.251212 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 
00:07:44.307552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307575 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307585 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410267 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512368 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512404 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512437 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.541706 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.541886 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.545601 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614749 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.717002 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819590 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819686 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922239 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922309 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.019913 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.023398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.023796 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024341 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.025969 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.027864 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.028093 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.039931 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.058451 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.073218 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.086925 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.096263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.105140 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.120681 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125819 4713 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.129304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.148479 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.162959 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.177653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.187702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.197816 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.210200 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.223806 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228293 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.239883 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.254509 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.265419 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.276577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.286553 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.306487 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.319602 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338054 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338107 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338115 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.344503 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.358305 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z"
Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.372318 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.388555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.400110 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.418400 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440862 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440870 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440894 4713 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.540278 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:45 crc kubenswrapper[4713]: E0308 00:07:45.540594 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.540281 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:45 crc kubenswrapper[4713]: E0308 00:07:45.540692 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542798 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542866 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.645010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.645029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747673 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747767 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953531 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.032601 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.033230 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.035962 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" exitCode=1 Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.035999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.036060 4713 scope.go:117] "RemoveContainer" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.036789 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.037009 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.056988 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057067 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.066760 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.081995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.095317 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.108675 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.126977 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.140125 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.150762 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.158995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc 
kubenswrapper[4713]: I0308 00:07:46.159255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159323 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.160117 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.180647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.194667 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.214522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true 
include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.226138 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.239091 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.261735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.261959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262349 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365553 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365632 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467782 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.540072 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.540317 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.568305 4713 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.576390 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.596088 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.611078 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.634653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.636565 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.678079 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.693096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.704452 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.716513 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.726786 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.740359 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.754456 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r"] Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.755250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.757026 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.757338 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.763927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.777564 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.794875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator 
aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vn4\" (UniqueName: \"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 
00:07:46.808340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 
00:07:46.818486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.831282 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.843799 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.856536 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.866999 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.876492 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.894031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vn4\" (UniqueName: 
\"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901697 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901729 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.902489 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.902679 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc 
kubenswrapper[4713]: I0308 00:07:46.910044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.914455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.917721 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vn4\" (UniqueName: \"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.927216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.944856 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.961439 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.973413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.985714 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.003118 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true 
include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.021588 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.042252 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.047140 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.047405 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.067141 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf124467
4af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.068579 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:47 crc kubenswrapper[4713]: W0308 00:07:47.089888 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f22c2d7_0e3d_4132_b548_87e98062c766.slice/crio-654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c WatchSource:0}: Error finding container 654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c: Status 404 returned error can't find the container with id 654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.092589 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.113631 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.128924 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.150365 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.169537 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.180147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.198431 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.210945 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.225102 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.238081 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.259031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.272020 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.284618 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.304417 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.481284 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.481798 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.481922 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.494619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.507872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.507938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.511896 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.524869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T
00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.540339 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.540436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.540486 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.540557 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.541324 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d355
2ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.551863 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.560129 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.569427 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.580248 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.589879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.597773 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.605871 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.608589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.608646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.608762 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.608814 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:48.108799995 +0000 UTC m=+122.228432228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.624385 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991
d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.625657 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.636030 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.645629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.655763 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.671873 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.050907 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"486f1bf6be2e719226620d95e54e8e22a36b59998eb9cac6154f86fc5675234c"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.051002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.051031 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" 
event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.071872 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.093541 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.107475 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.114093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.114373 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.114491 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:49.114458266 +0000 UTC m=+123.234090559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.128962 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.150018 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.164798 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.178513 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.196624 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.220145 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.239617 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.274245 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.291233 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.313806 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.336091 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.352346 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.372364 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.540749 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.540910 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.123966 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.124454 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.124526 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.124503456 +0000 UTC m=+125.244135719 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540520 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540573 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540720 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540790 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540873 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:50 crc kubenswrapper[4713]: I0308 00:07:50.540706 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:50 crc kubenswrapper[4713]: E0308 00:07:50.541507 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.144361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.144539 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.144638 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:55.144611726 +0000 UTC m=+129.264243999 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.346363 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.346484 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.346459146 +0000 UTC m=+157.466091389 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448255 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448390 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448513 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448556 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448576 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448614 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448630 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448666 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448681 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448685 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448659993 +0000 UTC m=+157.568292256 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448719 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448703394 +0000 UTC m=+157.568335667 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448767 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448733155 +0000 UTC m=+157.568365418 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448919 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448981 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448967811 +0000 UTC m=+157.568600084 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539816 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539906 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.539973 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539982 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.540096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.540304 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.637572 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:07:52 crc kubenswrapper[4713]: I0308 00:07:52.540335 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:52 crc kubenswrapper[4713]: E0308 00:07:52.540499 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540394 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540430 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540497 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540543 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540736 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540810 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211711 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211769 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.230804 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235231 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235255 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.254364 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.257963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.257999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258037 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.271319 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275387 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275441 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275450 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.294417 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298697 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.312676 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.312946 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.540333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.540470 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.184027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.184224 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.184297 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:03.184278713 +0000 UTC m=+137.303910956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540534 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540667 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540566 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540552 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540732 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.540375 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:56 crc kubenswrapper[4713]: E0308 00:07:56.540503 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.559566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.577519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.590076 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.603787 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.623721 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.636137 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: E0308 00:07:56.639137 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.647873 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.664266 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.675922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.686182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.697195 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.706166 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b59998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.719308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.737760 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.750008 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.761363 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:57 crc 
kubenswrapper[4713]: I0308 00:07:57.540229 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540404 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:57 crc kubenswrapper[4713]: I0308 00:07:57.540489 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540548 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:57 crc kubenswrapper[4713]: I0308 00:07:57.540593 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540658 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:58 crc kubenswrapper[4713]: I0308 00:07:58.540373 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:58 crc kubenswrapper[4713]: E0308 00:07:58.540586 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.540975 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.541034 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.541005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541217 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541323 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541417 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:00 crc kubenswrapper[4713]: I0308 00:08:00.540681 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:00 crc kubenswrapper[4713]: E0308 00:08:00.540961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539894 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539932 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539966 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540026 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540105 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540203 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.640762 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:02 crc kubenswrapper[4713]: I0308 00:08:02.541154 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:02 crc kubenswrapper[4713]: E0308 00:08:02.541559 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:02 crc kubenswrapper[4713]: I0308 00:08:02.542010 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.112917 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.119221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406"} Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.119804 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.138665 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.148426 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc 
kubenswrapper[4713]: I0308 00:08:03.159143 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.171437 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a25275
7a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.182035 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.196649 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.207552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.207868 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.208007 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:19.207976373 +0000 UTC m=+153.327608616 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.208614 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.223268 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.239037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.253039 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.267983 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.287028 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.307811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.318097 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.333410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.351190 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/ne
tns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\
"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540389 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540426 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540461 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540523 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540630 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540748 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.555312 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.568813 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34
aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.579314 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.588800 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.598331 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.605633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.618889 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.629025 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.644680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.657189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.666517 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.715668 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.730696 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.741278 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.750767 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.760669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.775561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/ne
tns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\
"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.123460 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.124103 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126176 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" exitCode=1 Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126211 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406"} Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126275 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.127005 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.127192 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.148754 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.163620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.175920 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.186635 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.197810 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.215533 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.230043 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.239789 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.257063 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.269716 4713 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.281478 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.291108 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc 
kubenswrapper[4713]: I0308 00:08:04.300719 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.308859 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.320145 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.329744 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.540495 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.541025 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.555130 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569975 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569985 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.570010 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.583189 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585952 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.595959 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599282 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.610022 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.624684 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627864 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627934 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.638250 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.638362 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.134629 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.138602 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.138759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.152113 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.171749 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.188985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.199115 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc 
kubenswrapper[4713]: I0308 00:08:05.211342 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.226707 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.239428 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.250764 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.267705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.281111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.302348 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.315685 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.327364 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.340953 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.356459 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.370272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.382893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540302 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540436 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540472 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540562 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540728 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540883 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.540303 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:06 crc kubenswrapper[4713]: E0308 00:08:06.540660 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.552522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.562064 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.575958 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01
f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.587435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.598041 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.609481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.621022 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.631225 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.640231 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: E0308 00:08:06.641666 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.649085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.675304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee78
66be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691
e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.687556 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.697908 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.715067 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.727111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.739367 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.749585 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:07 crc 
kubenswrapper[4713]: I0308 00:08:07.541998 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542468 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:07 crc kubenswrapper[4713]: I0308 00:08:07.542164 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:07 crc kubenswrapper[4713]: I0308 00:08:07.542276 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542645 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542705 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:08 crc kubenswrapper[4713]: I0308 00:08:08.540164 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:08 crc kubenswrapper[4713]: E0308 00:08:08.540384 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.540761 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540575 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.541002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.541090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:10 crc kubenswrapper[4713]: I0308 00:08:10.540310 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:10 crc kubenswrapper[4713]: E0308 00:08:10.540537 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540655 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.540813 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540933 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.541133 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.541163 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.643268 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:12 crc kubenswrapper[4713]: I0308 00:08:12.540764 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:12 crc kubenswrapper[4713]: E0308 00:08:12.541084 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540721 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540775 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.540871 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540799 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.540970 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.541026 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:14 crc kubenswrapper[4713]: I0308 00:08:14.540132 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:14 crc kubenswrapper[4713]: E0308 00:08:14.540274 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.010578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011640 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.033042 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036784 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.048729 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052335 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052491 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052588 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.065508 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.068885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069244 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.082878 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087234 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.102093 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.102210 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540320 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.540654 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.540928 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540972 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.541176 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.540133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:16 crc kubenswrapper[4713]: E0308 00:08:16.540320 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.561919 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.593308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.612239 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.626803 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc 
kubenswrapper[4713]: I0308 00:08:16.641711 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: E0308 00:08:16.644417 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.656569 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.671358 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.681309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.703560 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.716743 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.728543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.738874 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.763984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.785488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.796004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.806976 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.817669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540493 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.540686 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540722 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540802 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.541378 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.541466 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.542024 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.542291 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:18 crc kubenswrapper[4713]: I0308 00:08:18.540398 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:18 crc kubenswrapper[4713]: E0308 00:08:18.540533 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.269806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.269993 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.270076 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:51.270059668 +0000 UTC m=+185.389691901 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540522 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540522 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540642 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540884 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:20 crc kubenswrapper[4713]: I0308 00:08:20.540517 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:20 crc kubenswrapper[4713]: E0308 00:08:20.540677 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540408 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540548 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540435 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540771 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540882 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.646181 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198182 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198276 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" exitCode=1 Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198324 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2"} Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.199016 4713 scope.go:117] "RemoveContainer" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.214226 4713 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.247639 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir
\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.266891 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.281194 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.292221 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.306234 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.318056 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.332262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.344775 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.362463 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.375940 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.390759 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc 
kubenswrapper[4713]: I0308 00:08:22.404469 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.415349 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.426871 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.435914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.448342 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.540549 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:22 crc kubenswrapper[4713]: E0308 00:08:22.540804 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.202773 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.202840 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f"} Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.220284 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.234260 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.246543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.260415 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.272252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.296577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.308702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.319699 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.338543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.358583 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.373037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.384141 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.396263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T
00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.404452 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.404553 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.404531333 +0000 UTC m=+221.524163566 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.410790 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1c
e8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-0
8T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.423510 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.437703 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.450147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.505985 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506058 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506143 4713 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506208048 +0000 UTC m=+221.625840281 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506244 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506285 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506325 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.50628544 +0000 UTC m=+221.625917753 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506340 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506423 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506469 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506487 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506517 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506484385 +0000 UTC m=+221.626116658 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506546 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506530886 +0000 UTC m=+221.626163129 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540044 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540070 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540182 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540264 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540388 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540545 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:24 crc kubenswrapper[4713]: I0308 00:08:24.541073 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:24 crc kubenswrapper[4713]: E0308 00:08:24.541288 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395376 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.417152 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.424002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.424019 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.446982 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455134 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.469209 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474676 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.475033 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.489944 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494582 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.507979 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.508111 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540541 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540567 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540541 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540730 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540777 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.540640 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:26 crc kubenswrapper[4713]: E0308 00:08:26.541088 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.553904 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.555524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.588340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.605478 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.623646 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: E0308 00:08:26.647753 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.648891 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.667415 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.687432 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.705234 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.720131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.738508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.758916 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.776304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.791041 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.823084 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.842289 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.854892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.866754 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540619 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540748 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540813 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541060 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541310 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541471 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:28 crc kubenswrapper[4713]: I0308 00:08:28.540441 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:28 crc kubenswrapper[4713]: E0308 00:08:28.540646 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540321 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540382 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541152 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541255 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:30 crc kubenswrapper[4713]: I0308 00:08:30.540757 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:30 crc kubenswrapper[4713]: E0308 00:08:30.541208 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:30 crc kubenswrapper[4713]: I0308 00:08:30.553886 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540576 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.540751 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540586 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540576 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.541209 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.541392 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.541424 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.650022 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.234765 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.237965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e"} Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.238407 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.254423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.268241 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.278629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.295644 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.307746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.323619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.338561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.350907 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.361633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.371216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.380248 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.390075 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.407433 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.416763 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.430680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.461658 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.480207 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.496450 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc 
kubenswrapper[4713]: I0308 00:08:32.514445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.540723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:32 crc kubenswrapper[4713]: E0308 00:08:32.540927 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.242860 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.243600 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246344 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" exitCode=1 Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e"} Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246427 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.247141 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.247322 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:33 crc 
kubenswrapper[4713]: I0308 00:08:33.263534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.274383 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.298292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.318252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370
411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.334345 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.352328 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.371892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.391679 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.412004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.429405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.444000 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.472324 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.487096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.501290 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.522223 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 
address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.534695 4713 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540338 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540377 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540399 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540503 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540599 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540699 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.548872 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.562423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":
\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.573756 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc 
kubenswrapper[4713]: I0308 00:08:34.251925 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.255692 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:34 crc kubenswrapper[4713]: E0308 00:08:34.256017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.275405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.294186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.310455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.325425 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.342576 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.356936 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.367512 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.380008 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.398929 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.412678 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.425592 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.436952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.448404 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.462238 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.472202 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.489121 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.514111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.529587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.539630 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.540938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:34 crc kubenswrapper[4713]: E0308 00:08:34.541080 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.540725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541340 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.541020 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541461 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.540750 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541562 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727553 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727621 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.742239 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.746922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747069 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.761594 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765259 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765353 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.779388 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783678 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.803006 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.806960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.806995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807038 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.825932 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z"
Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.826204 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.540388 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 00:08:36 crc kubenswrapper[4713]: E0308 00:08:36.540597 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.563555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.583123 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.599912 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.618012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.637000 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: E0308 00:08:36.651196 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.652643 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.679874 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee78
66be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691
e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.698729 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.715412 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.740590 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.756985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.806323 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.822610 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.838748 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc 
kubenswrapper[4713]: I0308 00:08:36.854765 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.867217 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.882683 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.896413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370
411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.914205 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540106 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540355 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540444 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540486 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540606 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540885 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:38 crc kubenswrapper[4713]: I0308 00:08:38.540299 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:38 crc kubenswrapper[4713]: E0308 00:08:38.540523 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.540619 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.540622 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.540776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.540956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.542004 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.542281 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:40 crc kubenswrapper[4713]: I0308 00:08:40.541016 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:40 crc kubenswrapper[4713]: E0308 00:08:40.542109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540368 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540404 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540480 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540502 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540683 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.652645 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:42 crc kubenswrapper[4713]: I0308 00:08:42.540493 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:42 crc kubenswrapper[4713]: E0308 00:08:42.540688 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540941 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540971 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540945 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541187 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541285 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541421 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:44 crc kubenswrapper[4713]: I0308 00:08:44.540973 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:44 crc kubenswrapper[4713]: E0308 00:08:44.541162 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540025 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540037 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540215 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540364 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540491 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939865 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939899 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:45Z","lastTransitionTime":"2026-03-08T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.017984 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j"] Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.018524 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.020419 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.020668 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.021234 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.022231 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.052925 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.052903051 podStartE2EDuration="16.052903051s" podCreationTimestamp="2026-03-08 00:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 
00:08:46.035637328 +0000 UTC m=+180.155269571" watchObservedRunningTime="2026-03-08 00:08:46.052903051 +0000 UTC m=+180.172535284" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.124803 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fh96f" podStartSLOduration=108.124786588 podStartE2EDuration="1m48.124786588s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.111157075 +0000 UTC m=+180.230789308" watchObservedRunningTime="2026-03-08 00:08:46.124786588 +0000 UTC m=+180.244418821" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.139540 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.139527628 podStartE2EDuration="42.139527628s" podCreationTimestamp="2026-03-08 00:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.139482137 +0000 UTC m=+180.259114410" watchObservedRunningTime="2026-03-08 00:08:46.139527628 +0000 UTC m=+180.259159861" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163081 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163182 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163220 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163260 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163474 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.193949 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fp2h2" podStartSLOduration=109.193930355 podStartE2EDuration="1m49.193930355s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.193625657 +0000 UTC 
m=+180.313257900" watchObservedRunningTime="2026-03-08 00:08:46.193930355 +0000 UTC m=+180.313562588" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.218939 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-54zzt" podStartSLOduration=108.218914693 podStartE2EDuration="1m48.218914693s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.218238086 +0000 UTC m=+180.337870329" watchObservedRunningTime="2026-03-08 00:08:46.218914693 +0000 UTC m=+180.338546966" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.258318 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podStartSLOduration=109.258291482 podStartE2EDuration="1m49.258291482s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.248192358 +0000 UTC m=+180.367824621" watchObservedRunningTime="2026-03-08 00:08:46.258291482 +0000 UTC m=+180.377923745" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.258728 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d9bpk" podStartSLOduration=109.258719143 podStartE2EDuration="1m49.258719143s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.258651011 +0000 UTC m=+180.378283254" watchObservedRunningTime="2026-03-08 00:08:46.258719143 +0000 UTC m=+180.378351416" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264399 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264569 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264775 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265026 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265194 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265440 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.267640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.270785 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.286559 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.311985 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.311956211 podStartE2EDuration="1m25.311956211s" podCreationTimestamp="2026-03-08 00:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.311417767 +0000 UTC m=+180.431050020" watchObservedRunningTime="2026-03-08 00:08:46.311956211 +0000 UTC m=+180.431588484" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.312568 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.312551006 podStartE2EDuration="1m7.312551006s" podCreationTimestamp="2026-03-08 00:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.291729042 +0000 UTC m=+180.411361285" watchObservedRunningTime="2026-03-08 00:08:46.312551006 +0000 UTC m=+180.432183279" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.324296 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=20.32427949 podStartE2EDuration="20.32427949s" podCreationTimestamp="2026-03-08 00:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.323155552 +0000 UTC m=+180.442787805" watchObservedRunningTime="2026-03-08 00:08:46.32427949 +0000 UTC m=+180.443911743" Mar 08 00:08:46 crc 
kubenswrapper[4713]: I0308 00:08:46.337101 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: W0308 00:08:46.352606 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f7b3f3_83a6_447a_8858_960ae6c3006f.slice/crio-ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8 WatchSource:0}: Error finding container ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8: Status 404 returned error can't find the container with id ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8 Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.392185 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" podStartSLOduration=108.392164636 podStartE2EDuration="1m48.392164636s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.376985375 +0000 UTC m=+180.496617598" watchObservedRunningTime="2026-03-08 00:08:46.392164636 +0000 UTC m=+180.511796869" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.540632 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:46 crc kubenswrapper[4713]: E0308 00:08:46.542586 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.586798 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.596393 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 00:08:46 crc kubenswrapper[4713]: E0308 00:08:46.654315 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.301279 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" event={"ID":"83f7b3f3-83a6-447a-8858-960ae6c3006f","Type":"ContainerStarted","Data":"802fa8b46c29d33985b26a594c4dd0ef927c5969db901ae90aff984c19581262"} Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.301335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" event={"ID":"83f7b3f3-83a6-447a-8858-960ae6c3006f","Type":"ContainerStarted","Data":"ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8"} Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.324601 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" podStartSLOduration=109.324566693 podStartE2EDuration="1m49.324566693s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:47.323419194 +0000 UTC m=+181.443051437" 
watchObservedRunningTime="2026-03-08 00:08:47.324566693 +0000 UTC m=+181.444198956" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540620 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540755 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541280 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541908 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.542638 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.542977 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:48 crc kubenswrapper[4713]: I0308 00:08:48.540959 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:48 crc kubenswrapper[4713]: E0308 00:08:48.541171 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540753 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540870 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540789 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541008 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541136 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:50 crc kubenswrapper[4713]: I0308 00:08:50.540931 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:50 crc kubenswrapper[4713]: E0308 00:08:50.541130 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.323524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.323720 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.323859 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:55.32380577 +0000 UTC m=+249.443438043 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540327 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540390 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540558 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540702 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540867 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540983 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.655885 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:52 crc kubenswrapper[4713]: I0308 00:08:52.540651 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:52 crc kubenswrapper[4713]: E0308 00:08:52.540919 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540547 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540616 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540691 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.540857 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.541017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.541151 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:54 crc kubenswrapper[4713]: I0308 00:08:54.540552 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:54 crc kubenswrapper[4713]: E0308 00:08:54.540747 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.540134 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.540270 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.540377 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.540793 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.541029 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.541208 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:56 crc kubenswrapper[4713]: I0308 00:08:56.539984 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:56 crc kubenswrapper[4713]: E0308 00:08:56.543622 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:56 crc kubenswrapper[4713]: E0308 00:08:56.657674 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540022 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540088 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540260 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.540960 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.541068 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.541571 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:58 crc kubenswrapper[4713]: I0308 00:08:58.540113 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:58 crc kubenswrapper[4713]: E0308 00:08:58.540427 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540588 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.540769 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540771 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.541455 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.541699 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.541765 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.542052 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:09:00 crc kubenswrapper[4713]: I0308 00:09:00.540413 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:00 crc kubenswrapper[4713]: E0308 00:09:00.540625 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540633 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.540795 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540634 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540887 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.541303 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.541509 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.658430 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:02 crc kubenswrapper[4713]: I0308 00:09:02.540612 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:02 crc kubenswrapper[4713]: E0308 00:09:02.540759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.539964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.540006 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.540415 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.541462 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.541922 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.542112 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:04 crc kubenswrapper[4713]: I0308 00:09:04.540208 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:04 crc kubenswrapper[4713]: E0308 00:09:04.540439 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540556 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540684 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.540740 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540582 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.540967 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.541069 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:06 crc kubenswrapper[4713]: I0308 00:09:06.540889 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:06 crc kubenswrapper[4713]: E0308 00:09:06.542143 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:06 crc kubenswrapper[4713]: E0308 00:09:06.659928 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540591 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540695 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540732 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.540804 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.541051 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.541309 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.370474 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371020 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371058 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" exitCode=1 Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f"} Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371140 4713 scope.go:117] "RemoveContainer" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371580 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:09:08 crc kubenswrapper[4713]: E0308 00:09:08.371927 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.540171 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:08 crc kubenswrapper[4713]: E0308 00:09:08.540296 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.374632 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540319 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540450 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540513 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540533 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540579 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540658 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:10 crc kubenswrapper[4713]: I0308 00:09:10.541055 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:10 crc kubenswrapper[4713]: E0308 00:09:10.541195 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.540959 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.541018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.541090 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.541192 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.541958 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.542118 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.542559 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.542869 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.661170 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:12 crc kubenswrapper[4713]: I0308 00:09:12.540804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:12 crc kubenswrapper[4713]: E0308 00:09:12.541315 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540635 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540677 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540714 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540887 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540987 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:14 crc kubenswrapper[4713]: I0308 00:09:14.541051 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:14 crc kubenswrapper[4713]: E0308 00:09:14.541169 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588053 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588080 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588142 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588269 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588566 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588755 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:16 crc kubenswrapper[4713]: I0308 00:09:16.540042 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:16 crc kubenswrapper[4713]: E0308 00:09:16.541502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.503816 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.540488 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.540767 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540939 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.541018 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:18 crc kubenswrapper[4713]: I0308 00:09:18.541060 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:18 crc kubenswrapper[4713]: E0308 00:09:18.541217 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.540638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.540638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.540872 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.541105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.541114 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.541176 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:20 crc kubenswrapper[4713]: I0308 00:09:20.540466 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:20 crc kubenswrapper[4713]: I0308 00:09:20.540897 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:09:20 crc kubenswrapper[4713]: E0308 00:09:20.540921 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.519436 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.519489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"} Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540560 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540574 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540680 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.540778 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.540914 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.541057 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:22 crc kubenswrapper[4713]: E0308 00:09:22.505443 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:22 crc kubenswrapper[4713]: I0308 00:09:22.540573 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:22 crc kubenswrapper[4713]: E0308 00:09:22.540697 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540693 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540756 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541316 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.541340 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541387 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.310050 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.536532 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.539266 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:24 crc kubenswrapper[4713]: E0308 00:09:24.539638 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.539898 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.543339 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.543393 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:24 crc kubenswrapper[4713]: E0308 00:09:24.543459 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.566496 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podStartSLOduration=146.566479527 podStartE2EDuration="2m26.566479527s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:24.566017255 +0000 UTC m=+218.685649518" watchObservedRunningTime="2026-03-08 00:09:24.566479527 +0000 UTC m=+218.686111760" Mar 08 00:09:25 crc kubenswrapper[4713]: I0308 00:09:25.540454 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:25 crc kubenswrapper[4713]: I0308 00:09:25.540478 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:25 crc kubenswrapper[4713]: E0308 00:09:25.540615 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:25 crc kubenswrapper[4713]: E0308 00:09:25.540757 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:26 crc kubenswrapper[4713]: I0308 00:09:26.541094 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:26 crc kubenswrapper[4713]: E0308 00:09:26.543265 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:26 crc kubenswrapper[4713]: I0308 00:09:26.543652 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:26 crc kubenswrapper[4713]: E0308 00:09:26.543803 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.471108 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.471238 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.471218072 +0000 UTC m=+343.590850305 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.540085 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.540131 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.542277 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.542439 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571556 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.571676 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.571721 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.571708017 +0000 UTC m=+343.691340250 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.572046 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.572085 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.572073987 +0000 UTC m=+343.691706220 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.577578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.577677 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.855285 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.863792 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.887883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.930347 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931065 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931529 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931988 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.934664 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.934665 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935690 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935811 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935737 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.936589 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.936982 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.937131 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"]
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.937713 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.939768 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"]
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.940369 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"]
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.940807 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.941121 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"]
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.941511 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.942179 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.945220 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"]
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.945614 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.991298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.991540 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992127 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992375 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992383 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992410 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992470 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:27.999961 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gk97q"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.000539 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.008642 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.008921 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009077 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009248 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009663 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009785 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010061 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010190 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010303 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010429 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.029321 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.029694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030425 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030533 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030571 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030687 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030790 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.031084 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034127 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034347 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034418 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034596 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034861 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034361 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.041956 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044710 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044729 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044953 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045002 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045071 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045267 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045365 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042464 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042578 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042695 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042743 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.043013 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045691 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.049988 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053332 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053671 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053744 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.063561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z4s84"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.065020 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.066458 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.067850 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.071977 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.072907 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079617 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079653 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079673 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079725 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079778 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079797 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111183 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111400 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111659 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112231 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.085588 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112915 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112941 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113006 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113072 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113118 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113180 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113223 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113242 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113264 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113282 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113304 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113364 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113385 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113432 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113500 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113872 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.114907 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.115289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.115475 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.116807 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.117012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.117756 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.119588 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120161 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120452 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120754 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120917 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.121030 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.124072 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125559 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125988 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.126203 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.126355 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127651 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127750 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127847 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.128030 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.129711 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.130646 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131173 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131335 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.132411 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133161 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133330 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-drs4q"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133909 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133989 4713 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.134180 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.152881 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.153522 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.153858 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154090 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154519 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.155813 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.157930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.157956 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.159476 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.158468 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.158718 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.160412 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.164116 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.164950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165067 
4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165267 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165394 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165794 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.166098 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.178715 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.179182 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184409 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184744 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184747 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.187675 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.197449 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.197997 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.200062 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202296 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202371 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202616 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202923 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202854 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203315 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203733 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203984 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204171 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204376 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204627 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204804 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205412 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205580 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205599 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205612 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205839 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.206586 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.212530 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214093 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214469 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214590 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214608 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215024 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215917 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215946 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215969 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod \"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216068 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfkqd\" (UniqueName: 
\"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216106 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217222 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217248 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217265 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217283 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217305 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217329 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg7d\" 
(UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217492 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217526 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217595 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217639 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217664 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.217876 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218517 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218888 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.218957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218987 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219002 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219043 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219654 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.219917 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.719905564 +0000 UTC m=+222.839537797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220024 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220079 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" 
(UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220129 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220298 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220377 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220488 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220574 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220647 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221137 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221170 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222107 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222307 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222325 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222521 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223139 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223204 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223938 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.224028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.224096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.225338 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226188 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226294 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226318 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226433 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.226516 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " 
pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227006 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227037 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227061 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227098 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod 
\"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228855 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229095 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229476 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.230019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.231179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: 
\"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232431 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233247 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232762 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232868 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232560 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233691 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233949 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.234588 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.234891 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.235930 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.236562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.237999 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.238481 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.240011 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.241876 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.242310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.242886 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.245495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.246735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.249226 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.252073 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.255450 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.260451 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.262369 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.268109 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.270476 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.272455 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.274741 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.277583 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.280060 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.282538 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.289131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.292074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.292370 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.294183 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.294288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.295027 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"]
Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.296209 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.297748 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.298618 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.298910 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.299604 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.299762 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sxbdk"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.301126 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.301343 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.302638 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.303865 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.305162 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.306341 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.307493 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.308674 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.309907 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gk97q"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.310995 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.312077 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"] Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.312914 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.313177 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.314408 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.315747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.316956 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.318234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.319555 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.320685 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.323637 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.323690 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.326236 4713 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328137 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.328327 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.828302728 +0000 UTC m=+222.947934961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328383 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328427 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328495 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328533 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328567 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: 
\"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328730 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328795 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328846 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328872 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328984 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329010 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329075 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329163 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329188 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329225 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfg7d\" (UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329299 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329322 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329376 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329399 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329422 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfnr\" (UniqueName: 
\"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329523 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329550 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329615 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329636 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329656 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329706 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329724 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329749 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329789 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329844 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329864 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27jn\" (UniqueName: \"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329903 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329939 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329975 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329994 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330066 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330088 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330234 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330584 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod \"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfkqd\" (UniqueName: \"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330319 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330855 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330879 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331212 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331272 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331303 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332001 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332050 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332138 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.332170 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.832138434 +0000 UTC m=+222.951770737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332207 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332225 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332358 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332466 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332599 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332673 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332696 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332811 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332875 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333104 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333489 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333910 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334322 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334704 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335214 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335774 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336136 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336834 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336895 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336905 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337442 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.338920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.339545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.353074 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.373817 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.393927 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.413597 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.433523 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434404 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434761 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434788 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.435166 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.935119812 +0000 UTC m=+223.054752045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435250 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName:
\"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435340 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435372 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435420 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 
00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435549 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435593 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435763 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435884 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435904 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 
00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436063 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436081 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436174 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436202 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436427 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfnr\" (UniqueName: \"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.438146 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439065 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439090 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439222 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439424 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439450 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.439738 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.439816 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.93979933 +0000 UTC m=+223.059431563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440429 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440542 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440589 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.442537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.445345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447071 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.447151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447262 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447336 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447541 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447563 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447612 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27jn\" (UniqueName: \"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447675 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447740 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447886 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447911 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448040 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448061 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448107 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448111 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448127 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268pq\" (UniqueName: \"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448190 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: 
\"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448306 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448375 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448400 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448442 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " 
pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448520 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448601 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448598 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448669 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.451156 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.451368 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453161 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453285 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453503 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.454244 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.454652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.456046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.459276 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.474169 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.494047 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.514055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.533116 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.540048 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.540272 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.542528 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.551295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.551790 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:29.051447485 +0000 UTC m=+223.171079728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552284 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552612 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552755 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552794 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552913 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268pq\" (UniqueName: 
\"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553261 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: 
\"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553393 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: 
\"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553437 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553477 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " 
pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553725 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.554147 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.054137363 +0000 UTC m=+223.173769596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556576 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e5b834fc84e3d300046cd1fdbffb156a0e873fcbfbfe0a7c813e27e35445753c"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556680 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"435cc4d28c45eb6127b40eadb5213f6a7bded3488d572996cdfa93f02b79b622"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556920 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559199 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"644fcf93fd59fdbb47c6c87645c6873caee77e45d0017c72c213bddca9a014ef"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559250 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"433a99e1791d6106056165c414a7ac15d22ecdfc4eef2654050d166efe18a4ff"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.573281 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.584155 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.593240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.613424 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.633511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.653965 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.654612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.654855 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.154813792 +0000 UTC m=+223.274446015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.655289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.655844 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:29.155813977 +0000 UTC m=+223.275446210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.673372 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.684410 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.693898 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.700685 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.713579 4713 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.753385 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.756201 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.756333 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.256316023 +0000 UTC m=+223.375948266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.756509 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.756841 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.256817645 +0000 UTC m=+223.376449878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.773597 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.792910 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.801814 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.812756 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.819723 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.832916 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.853610 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.858134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.858230 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.358211373 +0000 UTC m=+223.477843606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.858377 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.858698 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.358688855 +0000 UTC m=+223.478321088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.872806 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.885487 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.892466 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.912814 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.916601 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.933461 4713 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.953289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.960335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.960404 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.46038662 +0000 UTC m=+223.580018853 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.960958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.961303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.461292313 +0000 UTC m=+223.580924546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.962267 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.973652 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.983602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.994026 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.012801 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 00:09:29 
crc kubenswrapper[4713]: I0308 00:09:29.033397 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.053706 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.061749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.061963 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.561939132 +0000 UTC m=+223.681571375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.062057 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.062502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.562484086 +0000 UTC m=+223.682116319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.073454 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.083988 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.093462 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.113264 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.132961 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.139475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.152941 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.163532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.163633 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.663609147 +0000 UTC m=+223.783241370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.163731 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.164057 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.664049218 +0000 UTC m=+223.783681451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.172999 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.181462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.183882 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.192867 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.211903 4713 request.go:700] Waited for 1.005869394s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.213750 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.221533 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.233175 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.252813 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.262138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.265251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc 
kubenswrapper[4713]: E0308 00:09:29.265401 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.765368034 +0000 UTC m=+223.885000277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.265852 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.266384 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.766357848 +0000 UTC m=+223.885990091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.273284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.278092 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.294359 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.313585 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.322910 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.333214 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: 
I0308 00:09:29.352951 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.366563 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.366693 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.866677349 +0000 UTC m=+223.986309582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.366890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.367227 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.867219623 +0000 UTC m=+223.986851856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.373675 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.380162 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.393789 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.401448 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.413167 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.433775 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.440140 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.440254 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert podName:d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.940220727 +0000 UTC m=+224.059852990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-h5mxt" (UID: "d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.447961 4713 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.448164 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca podName:9e570b68-8b4c-42e3-839d-f37943999246 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.948136146 +0000 UTC m=+224.067768409 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca") pod "marketplace-operator-79b997595-p9hqz" (UID: "9e570b68-8b4c-42e3-839d-f37943999246") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.450656 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.450689 4713 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.452068 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert podName:0d2f415a-2626-45f9-baf0-68ab25b9d079 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.950752202 +0000 UTC m=+224.070384475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert") pod "olm-operator-6b444d44fb-8m94r" (UID: "0d2f415a-2626-45f9-baf0-68ab25b9d079") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.452123 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics podName:9e570b68-8b4c-42e3-839d-f37943999246 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.952104366 +0000 UTC m=+224.071736639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics") pod "marketplace-operator-79b997595-p9hqz" (UID: "9e570b68-8b4c-42e3-839d-f37943999246") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.454015 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.468299 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.469734 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.969703098 +0000 UTC m=+224.089335371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.484396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.493908 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.534434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.547646 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552503 4713 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552539 4713 secret.go:188] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552601 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert podName:899ec382-6c79-460e-9e3c-9dfb25867855 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.0525824 +0000 UTC m=+224.172214633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert") pod "service-ca-operator-777779d784-5bltg" (UID: "899ec382-6c79-460e-9e3c-9dfb25867855") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552716 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert podName:158ba4b3-9da3-4a83-95dd-e625c7b19a2b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.052610791 +0000 UTC m=+224.172243024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert") pod "ingress-canary-xmjhj" (UID: "158ba4b3-9da3-4a83-95dd-e625c7b19a2b") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552793 4713 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552838 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config podName:899ec382-6c79-460e-9e3c-9dfb25867855 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.052814786 +0000 UTC m=+224.172447019 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config") pod "service-ca-operator-777779d784-5bltg" (UID: "899ec382-6c79-460e-9e3c-9dfb25867855") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554162 4713 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554288 4713 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554321 4713 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554338 4713 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554348 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554388 4713 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554292 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token podName:a8c7be2b-608c-4089-b8a6-76bef69c3588 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.054251232 +0000 UTC m=+224.173883505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token") pod "machine-config-server-sxbdk" (UID: "a8c7be2b-608c-4089-b8a6-76bef69c3588") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554279 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554416 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs podName:a8c7be2b-608c-4089-b8a6-76bef69c3588 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054405046 +0000 UTC m=+224.174037279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs") pod "machine-config-server-sxbdk" (UID: "a8c7be2b-608c-4089-b8a6-76bef69c3588") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554442 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert podName:3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054424887 +0000 UTC m=+224.174057120 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert") pod "packageserver-d55dfcdfc-g99pk" (UID: "3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554463 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle podName:ee63f184-4609-43d4-bdc1-2c840aef6d7f nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054453907 +0000 UTC m=+224.174086240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle") pod "service-ca-9c57cc56f-c4nq5" (UID: "ee63f184-4609-43d4-bdc1-2c840aef6d7f") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554481 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key podName:ee63f184-4609-43d4-bdc1-2c840aef6d7f nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054473078 +0000 UTC m=+224.174105411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key") pod "service-ca-9c57cc56f-c4nq5" (UID: "ee63f184-4609-43d4-bdc1-2c840aef6d7f") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554495 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert podName:3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.054487998 +0000 UTC m=+224.174120341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert") pod "packageserver-d55dfcdfc-g99pk" (UID: "3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554326 4713 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554505 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls podName:39da2ba4-aebb-485b-8e46-7ffc36efa490 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054501169 +0000 UTC m=+224.174133402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls") pod "dns-default-lwhnh" (UID: "39da2ba4-aebb-485b-8e46-7ffc36efa490") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554527 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume podName:39da2ba4-aebb-485b-8e46-7ffc36efa490 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054518399 +0000 UTC m=+224.174150742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume") pod "dns-default-lwhnh" (UID: "39da2ba4-aebb-485b-8e46-7ffc36efa490") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.554643 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.568573 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.570459 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.570884 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.07087006 +0000 UTC m=+224.190502293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.588102 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.613542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.622265 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.639615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.648531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.653243 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.655650 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.671787 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.671978 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.17194864 +0000 UTC m=+224.291580883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.672295 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.672692 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.172680238 +0000 UTC m=+224.292312471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.673774 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.693999 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.713617 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.732775 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.746057 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.753094 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 00:09:29 crc kubenswrapper[4713]: W0308 00:09:29.763775 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10940629_a0dc_4828_a913_20a754f4896b.slice/crio-2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad WatchSource:0}: Error finding container 2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad: Status 404 returned error can't find the container with id 2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.767203 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.773795 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.273755658 +0000 UTC m=+224.393387891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773884 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.774441 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.274428465 +0000 UTC m=+224.394060698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.793289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.813448 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.820337 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"] Mar 08 00:09:29 crc kubenswrapper[4713]: W0308 00:09:29.824873 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6893b56_2395_4f91_9349_c23b48b957c8.slice/crio-29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3 WatchSource:0}: Error finding container 29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3: Status 404 returned error can't find the container with id 29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3 Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.825960 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.833980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.847313 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.855646 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.874740 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.875295 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.375276679 +0000 UTC m=+224.494908912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.876943 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.894237 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.898217 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.915134 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.934852 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.935059 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.953083 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.953782 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.973564 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.976455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.976698 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977917 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.978249 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.478233656 +0000 UTC m=+224.597865889 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.979180 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.986568 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.987526 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.993140 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.994248 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.014898 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.021418 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.033606 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.038675 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ba1fb6_83e1_4a29_93a5_5abf00f86718.slice/crio-a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617 WatchSource:0}: Error finding container a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617: Status 404 returned error can't find the container with id a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.054480 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.074704 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.078946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079212 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079406 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079657 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.079784 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.579747936 +0000 UTC m=+224.699380209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079891 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079918 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.080457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: 
I0308 00:09:30.080901 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.080982 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.081171 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.082930 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.083219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093801 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" 
(UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093896 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093900 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.095364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.109417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.114066 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.133923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.145265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.154981 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.168098 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.174923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.184120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.184487 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.684474398 +0000 UTC m=+224.804106631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.188259 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.207164 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.232452 4713 request.go:700] Waited for 1.902487573s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.234866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " 
pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.247317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg7d\" (UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.271709 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod \"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.286352 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.286681 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.786660144 +0000 UTC m=+224.906292377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.286750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.287132 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.787115296 +0000 UTC m=+224.906747529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.288407 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfkqd\" (UniqueName: \"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.299445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.310804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.313507 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.319407 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.329386 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.340335 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.349301 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.360030 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.369369 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.373674 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.392317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.392841 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.892805461 +0000 UTC m=+225.012437694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.393545 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.416689 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.452251 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.490436 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.494182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.494774 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.994757943 +0000 UTC m=+225.114390176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.508888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.527270 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.527460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-gk97q"] Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.536374 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d068555_56f2_4bcf_8b4c_cc574ad087fa.slice/crio-8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a WatchSource:0}: Error finding container 8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a: Status 404 returned error can't find the container with id 8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.546753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.568752 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.575755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerStarted","Data":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.575809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerStarted","Data":"4dcd3efc63c2bb82108f5db86db8f7d5ce1c4ffb7c4a91ed149a6c9ab7e1050e"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.576740 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.580492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" event={"ID":"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec","Type":"ContainerStarted","Data":"3983d2aa68f6da8f44569c63ac9c2a782dcda7998ffa916f2360f4db5f684ce4"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.580535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" event={"ID":"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec","Type":"ContainerStarted","Data":"62d2206459d89bf6d737d11946b11561a84e0e852500858d623d83d9845ccafa"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583240 4713 generic.go:334] "Generic (PLEG): container finished" podID="bfa92863-23f8-42d4-8e73-433bf546d304" containerID="8d932a99f7ba16281e1a18006ec4ee445f240f93e3a565e114dcfe8b04d9a720" exitCode=0 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerDied","Data":"8d932a99f7ba16281e1a18006ec4ee445f240f93e3a565e114dcfe8b04d9a720"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" 
event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"b0ba9787a9b65059ba19235191be65a05b519d232255c85dc8cc1702a1a33dff"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583975 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.584028 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.589615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.595174 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.595498 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.095473974 +0000 UTC m=+225.215106197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.595579 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.595926 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.095913705 +0000 UTC m=+225.215545938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.601250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604123 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"c3313856c8bd270e779e3471c10a34b6df61acc366568e89bf7663e22bdf4185"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"3a26477e3ba90b535125524bf64cf9ce159f8050230c417111621ff9c77ef8d0"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604183 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606450 4713 generic.go:334] "Generic (PLEG): container finished" podID="c61cbc0b-441e-4704-accf-35963b3758aa" containerID="ef8e161c1c91b1f5e0788ba38a4581e70a6ba4e4085a7309a178813299b2fd64" exitCode=0 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606514 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerDied","Data":"ef8e161c1c91b1f5e0788ba38a4581e70a6ba4e4085a7309a178813299b2fd64"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerStarted","Data":"dd9fcb9296a5e60bcd45c21270f92c6b89629e223159d3ecc2eaaf679c9db764"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.607529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gk97q" event={"ID":"1d068555-56f2-4bcf-8b4c-cc574ad087fa","Type":"ContainerStarted","Data":"8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.610130 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerStarted","Data":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.610176 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerStarted","Data":"a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.611705 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.612986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" 
event={"ID":"10940629-a0dc-4828-a913-20a754f4896b","Type":"ContainerStarted","Data":"27db8f6bfae774d8dea6ec16c8c4cdd7826ed457f2c15b6aa7bcd6ca93f36a27"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.613054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" event={"ID":"10940629-a0dc-4828-a913-20a754f4896b","Type":"ContainerStarted","Data":"2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.618457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.618645 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621395 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621582 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621634 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.627990 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.628327 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.628494 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.646898 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.648273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.651326 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfnr\" (UniqueName: \"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.682485 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.692137 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.697993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.698122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.698541 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.198525973 +0000 UTC m=+225.318158206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.710174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.710439 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.726264 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.726816 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.734615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.751640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.770878 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.790922 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27jn\" 
(UniqueName: \"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.795783 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.797719 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.799668 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.800140 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.300119956 +0000 UTC m=+225.419752259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.801717 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00793875_21cf_4a6e_8da2_2d94bd3725c4.slice/crio-f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf WatchSource:0}: Error finding container f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf: Status 404 returned error can't find the container with id f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.819722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.820285 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"]
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.830465 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.836569 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"
Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.838380 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62cfca3e_2ad8_4964_bd9a_5f907f09ca1e.slice/crio-d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21 WatchSource:0}: Error finding container d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21: Status 404 returned error can't find the container with id d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.848786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.867791 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.869750 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.877152 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.885238 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.892040 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.894570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.901552 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.901984 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.401964625 +0000 UTC m=+225.521596858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.910067 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.917293 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.919465 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.932095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.933913 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.940464 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.954544 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.961642 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.974297 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.977893 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.989584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.003860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.004303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.504288676 +0000 UTC m=+225.623920909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.006294 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.008629 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.013815 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.032539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.043523 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.048725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.058768 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.073304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.076313 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.105211 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.110608 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.115601 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmjhj"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.116531 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.116596 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lwhnh"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.116727 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.616702211 +0000 UTC m=+225.736334444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.117120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.117502 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sxbdk"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.119897 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.61986959 +0000 UTC m=+225.739501823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.123189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.144406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.146794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268pq\" (UniqueName: \"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.193617 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218351 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.218775 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.718760865 +0000 UTC m=+225.838393098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.284151 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322348 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322485 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.323211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.323475 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.323640 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.82362732 +0000 UTC m=+225.943259553 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.328463 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.369451 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.371031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.390499 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.394463 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.397196 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"]
Mar 08 00:09:31 crc kubenswrapper[4713]: W0308 00:09:31.418231 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod548e19ee_14eb_4075_b9e3_69178800837c.slice/crio-07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9 WatchSource:0}: Error finding container 07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9: Status 404 returned error can't find the container with id 07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.432064 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.432225 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.932198948 +0000 UTC m=+226.051831181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.432271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.432563 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.932550797 +0000 UTC m=+226.052183030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.533608 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.533883 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.033848172 +0000 UTC m=+226.153480395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.534147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.534456 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.034445127 +0000 UTC m=+226.154077360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.535945 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.635235 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.637068 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.637321 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"]
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.637466 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.137445185 +0000 UTC m=+226.257077418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.657753 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.657957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"2b04c5dba8341e7071b5a25348b24b8fe49f2fa0f49283898e52f444691e4c4d"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.662570 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gk97q" event={"ID":"1d068555-56f2-4bcf-8b4c-cc574ad087fa","Type":"ContainerStarted","Data":"9d8dc8439406027f01ceb6aedaedc6496607794c932cf2f7e302ec056f77213e"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.663195 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.702463 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drs4q" event={"ID":"548e19ee-14eb-4075-b9e3-69178800837c","Type":"ContainerStarted","Data":"07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.723529 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.726501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" event={"ID":"00793875-21cf-4a6e-8da2-2d94bd3725c4","Type":"ContainerStarted","Data":"4202dd9aed16f2668e430b9808f118d1000f996e9ab98c6807453d6e03386ad7"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.727187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" event={"ID":"00793875-21cf-4a6e-8da2-2d94bd3725c4","Type":"ContainerStarted","Data":"f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.728426 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.738015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerStarted","Data":"f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.738075 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerStarted","Data":"6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.744164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.744508 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"]
Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.744898 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.244870325 +0000 UTC m=+226.364502558 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.759667 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.759716 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.764660 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"]
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.770739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.770775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"7d557f649440aa7d8979e239e3dbc43be1e038a5d177bc7e9b64392203cbedb0"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.790570 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" event={"ID":"0d2f415a-2626-45f9-baf0-68ab25b9d079","Type":"ContainerStarted","Data":"b986c3ea3367f0d8e16ad232b5d65a39e5e8c1b421c8da06daf14ef57c0db285"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.792861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" event={"ID":"5eb834dd-5358-45c4-bbca-50baf0e8656b","Type":"ContainerStarted","Data":"0d44a394195fcb4f42952a22e57cfd1a6b6f0db20a3d3d6af4abfcb58f3829f3"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.801687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerStarted","Data":"f069cfc486387c5cde34f5afb6cecf83b6fb955230bf1ce769adaf1a981ffba9"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815052 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21"}
Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815582 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 
08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.817147 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.817189 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.821201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"d3d0b0a012b975a10283fb4300f9ac3db386cac5a5ffd9a8c67b5efdc2cdbb7b"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.822460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" event={"ID":"69b6d0bc-e512-432d-9a6f-f79318c0f571","Type":"ContainerStarted","Data":"0f516b5e95ed11912f6e66ecaa5c09eef45730e1cf73f99b3a1ddd5f02aad27c"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.823265 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" event={"ID":"8f9a6567-ebe5-4ba9-80ab-a2cd48818942","Type":"ContainerStarted","Data":"1b788d5a6379b1837b939a466d3a77ee20b5cff8b78cf4ee661310e5c54b3d13"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.825453 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.826024 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"cfd46bb74a3a2e4e75cc309902049244faa91690798932d4b2acdf457dc24654"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.830313 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"acfab74d0e5aa0f60ce6b65943323fce0eb8ed34518b52222ec8a0203d809698"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832373 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"aa86b13cd04527b13fb395768d0e88b7a726c753651d2b1f343d3553ca45cc9c"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832906 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832955 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.833076 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": 
dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.833120 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.845598 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.845762 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.345730939 +0000 UTC m=+226.465363182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.846142 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.846435 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.346426976 +0000 UTC m=+226.466059209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: W0308 00:09:31.914049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141fc694_b9ce_4b84_9e39_0e79a487e398.slice/crio-a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6 WatchSource:0}: Error finding container a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6: Status 404 returned error can't find the container with id a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6 Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.934867 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.936106 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.941037 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.950239 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc 
kubenswrapper[4713]: E0308 00:09:31.951673 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.45165271 +0000 UTC m=+226.571284953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.014995 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" podStartSLOduration=155.014973791 podStartE2EDuration="2m35.014973791s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.01331257 +0000 UTC m=+226.132944803" watchObservedRunningTime="2026-03-08 00:09:32.014973791 +0000 UTC m=+226.134606024" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.040610 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.051172 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dbf7b38_8980_49e5_956c_08e443912846.slice/crio-4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3 WatchSource:0}: Error finding container 
4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3: Status 404 returned error can't find the container with id 4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.051850 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.052211 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.552200617 +0000 UTC m=+226.671832850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.073103 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" podStartSLOduration=154.073080621 podStartE2EDuration="2m34.073080621s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.068221319 +0000 UTC m=+226.187853552" watchObservedRunningTime="2026-03-08 00:09:32.073080621 +0000 UTC m=+226.192712864" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.074863 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.153006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.153530 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:32.653497362 +0000 UTC m=+226.773129595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.186214 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158ba4b3_9da3_4a83_95dd_e625c7b19a2b.slice/crio-e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0 WatchSource:0}: Error finding container e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0: Status 404 returned error can't find the container with id e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.187537 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.214797 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.216118 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3811a82_b0fe_4e06_948a_79cbbc840a98.slice/crio-7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc WatchSource:0}: Error finding container 7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc: Status 404 returned error can't find the container with id 7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc Mar 08 00:09:32 crc 
kubenswrapper[4713]: I0308 00:09:32.254708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.255452 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39da2ba4_aebb_485b_8e46_7ffc36efa490.slice/crio-b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497 WatchSource:0}: Error finding container b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497: Status 404 returned error can't find the container with id b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497 Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.256248 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.756230883 +0000 UTC m=+226.875863126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.303561 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.312747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.356032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.356454 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.856424631 +0000 UTC m=+226.976056864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.379075 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.409106 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.457945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.458285 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.95826781 +0000 UTC m=+227.077900043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.495274 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.502705 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod063a79dd_fbe8_4562_98bc_deb309b25182.slice/crio-fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e WatchSource:0}: Error finding container fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e: Status 404 returned error can't find the container with id fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.503219 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdccd72c_79d7_4388_926e_0539c571dafe.slice/crio-0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa WatchSource:0}: Error finding container 0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa: Status 404 returned error can't find the container with id 0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.505460 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.510155 4713 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.560152 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.560793 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.060770236 +0000 UTC m=+227.180402469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.645576 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" podStartSLOduration=155.645556156 podStartE2EDuration="2m35.645556156s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.642878949 +0000 UTC m=+226.762511202" watchObservedRunningTime="2026-03-08 00:09:32.645556156 +0000 UTC m=+226.765188389" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 
00:09:32.661351 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.661749 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.161721222 +0000 UTC m=+227.281353455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.700324 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gk97q" podStartSLOduration=154.700296691 podStartE2EDuration="2m34.700296691s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.69546839 +0000 UTC m=+226.815100623" watchObservedRunningTime="2026-03-08 00:09:32.700296691 +0000 UTC m=+226.819928924" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.762951 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.763160 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.26312556 +0000 UTC m=+227.382757793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.763282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.763580 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.263568581 +0000 UTC m=+227.383200814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.842616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"24fb7b611b2bba7816e13ffd395a56cee4b640ca9e46deb1afb7b067011d4ee1"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.844321 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" event={"ID":"0d2f415a-2626-45f9-baf0-68ab25b9d079","Type":"ContainerStarted","Data":"babe5ff1551993631dbb59509786ee87fc512912b19e1ab02fc1f3a5e61a47dc"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.845361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.846837 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" event={"ID":"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6","Type":"ContainerStarted","Data":"9fbc51b29e200e46787490449f1137ed821ea23125402318a6489ea2356fff8e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.848319 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmjhj" 
event={"ID":"158ba4b3-9da3-4a83-95dd-e625c7b19a2b","Type":"ContainerStarted","Data":"e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.854741 4713 generic.go:334] "Generic (PLEG): container finished" podID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerID="c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1" exitCode=0 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.854835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerDied","Data":"c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.857422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerStarted","Data":"8a2d896d73aedf449a67c5c1becd624d05fd0cc1bac64192c1528302ec9e1810"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.860521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" event={"ID":"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc","Type":"ContainerStarted","Data":"4f735baf03071a713358d5084a3ed1c39a064b786c0c8aab2cec625051e1bf4f"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.864251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.864631 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.3646113 +0000 UTC m=+227.484243533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.868112 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.871790 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"c1cc2bd2761912bed0bce72c583ff4a3ce293060ab546c49da1234cb5b624829"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.873847 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" event={"ID":"ee63f184-4609-43d4-bdc1-2c840aef6d7f","Type":"ContainerStarted","Data":"1403c7f7c82104f1fb2d5acbca121b2f621f34934f6c942ece623278837b82a7"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.875557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" 
event={"ID":"5eb834dd-5358-45c4-bbca-50baf0e8656b","Type":"ContainerStarted","Data":"7768995058b6d14ec7324fef4fdf9eb4130adf2619a94fd9384329ad45f0dda9"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.876778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" event={"ID":"8f9a6567-ebe5-4ba9-80ab-a2cd48818942","Type":"ContainerStarted","Data":"b8cee89ff59a87f0aef0cba5e55318481207c1684c87c8c0e24a463d0b451164"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.877792 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerStarted","Data":"e0d410e7c38a223bcd0189e0430b8bd6e62ba561f8515070eac1a52a52fdb35d"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.879278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sxbdk" event={"ID":"a8c7be2b-608c-4089-b8a6-76bef69c3588","Type":"ContainerStarted","Data":"1a02a4260c82b95218d95b7ec0f782a08c30d534af39889958bef08ce68a1906"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.879305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sxbdk" event={"ID":"a8c7be2b-608c-4089-b8a6-76bef69c3588","Type":"ContainerStarted","Data":"51ebfe85afcc3b7f2946066c966ebdfb5ef2285578327fa0c1fd2331c75de2e5"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.880599 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"0ff3f228823f254df81c9400e3bf969b1989214eb6d53eeaa806767239498a57"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.881799 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" event={"ID":"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0","Type":"ContainerStarted","Data":"2b23ab3e26964ba12243f80dc785e3757a7616b853625567abe3a07d108fa2ab"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.883443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" event={"ID":"141fc694-b9ce-4b84-9e39-0e79a487e398","Type":"ContainerStarted","Data":"d73551542a94ae92898d6c7f60f43b5e7b07f43a7fae03dedec4b045380c2e9a"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.883475 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" event={"ID":"141fc694-b9ce-4b84-9e39-0e79a487e398","Type":"ContainerStarted","Data":"a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.884633 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" event={"ID":"2be1cb07-55b6-4220-989e-13415c3156b2","Type":"ContainerStarted","Data":"8df7f254cdc361cd7a84eb9568ef8a92c58bfb920fd5787cd92bbf9eb19b0868"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.885449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"7503f9d76e1ead024b2d9e32c270ed5c7994c52e76c635dedfba01368986250e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.887192 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" 
event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"a2bffc41930aae799298676f6731be7f1a78453e81f87a04e4c86069af5275cd"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.891632 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podStartSLOduration=154.891604888 podStartE2EDuration="2m34.891604888s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.88887093 +0000 UTC m=+227.008503173" watchObservedRunningTime="2026-03-08 00:09:32.891604888 +0000 UTC m=+227.011237111" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.898970 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"112d9d26a15ed14170c83bc124ad4a214a7baca62e66a05d9828873540b36a76"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.899989 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"e5e4ce108e48921131f575c6266cdd05f448c77b1476fcea8f79ebd51be164e8"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.902038 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" event={"ID":"0dbf7b38-8980-49e5-956c-08e443912846","Type":"ContainerStarted","Data":"4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.902938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"9a9c988848cea61452547df38ee81f4d9d10b67c33f46376e69f961257d0ca10"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.904246 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drs4q" event={"ID":"548e19ee-14eb-4075-b9e3-69178800837c","Type":"ContainerStarted","Data":"5fba0849bd6ff6d74f814a7c60b06c8112cccf8bb3be1dcd07c57c070cebdb3a"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.905354 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" event={"ID":"69b6d0bc-e512-432d-9a6f-f79318c0f571","Type":"ContainerStarted","Data":"a45c92beedbf0140113aefd9290a111f882f2b9dd8f6241440aabf1ff34df979"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.906163 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerStarted","Data":"0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.906904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.908428 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" event={"ID":"0e43994e-0aa1-4541-bce9-502bbc1dc0a0","Type":"ContainerStarted","Data":"4463a907ae7393ef0e3efdac52e43e38ff1a3c88f6572b9c8af64744303321a8"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.909210 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"190a60dacb57686f7527fd359dcbee53cb27d86651b512ad3ef2e82c71e60229"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.909967 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" event={"ID":"899ec382-6c79-460e-9e3c-9dfb25867855","Type":"ContainerStarted","Data":"be2e83b64ebb1f15ce7422655ee6ab80fd10154ea455c673dcb802f1fea0d293"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.910787 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerStarted","Data":"f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911211 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911427 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911466 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" 
Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911614 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911642 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911874 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911881 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911917 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.965298 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.965608 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.465594437 +0000 UTC m=+227.585226670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.066592 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.066744 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.566711638 +0000 UTC m=+227.686343871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.067332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.068106 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.568096083 +0000 UTC m=+227.687728316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.168890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.169211 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.669193093 +0000 UTC m=+227.788825316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.269727 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podStartSLOduration=155.269711739 podStartE2EDuration="2m35.269711739s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.26775911 +0000 UTC m=+227.387391343" watchObservedRunningTime="2026-03-08 00:09:33.269711739 +0000 UTC m=+227.389343972" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.270087 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.770076548 +0000 UTC m=+227.889708781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.269849 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.326322 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-z4s84" podStartSLOduration=155.326305141 podStartE2EDuration="2m35.326305141s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.322121426 +0000 UTC m=+227.441753679" watchObservedRunningTime="2026-03-08 00:09:33.326305141 +0000 UTC m=+227.445937374" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.361043 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29548800-ghv4d" podStartSLOduration=156.361025893 podStartE2EDuration="2m36.361025893s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.359119835 +0000 UTC m=+227.478752068" 
watchObservedRunningTime="2026-03-08 00:09:33.361025893 +0000 UTC m=+227.480658126" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.371129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.371472 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.871457515 +0000 UTC m=+227.991089748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.371681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.371951 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.871942778 +0000 UTC m=+227.991575011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.404689 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" podStartSLOduration=155.40467388 podStartE2EDuration="2m35.40467388s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.402486245 +0000 UTC m=+227.522118498" watchObservedRunningTime="2026-03-08 00:09:33.40467388 +0000 UTC m=+227.524306113" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.444707 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podStartSLOduration=155.444691366 podStartE2EDuration="2m35.444691366s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.442265945 +0000 UTC m=+227.561898178" watchObservedRunningTime="2026-03-08 00:09:33.444691366 +0000 UTC m=+227.564323609" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.473171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.473394 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.973365026 +0000 UTC m=+228.092997259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.473521 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.473892 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.973879419 +0000 UTC m=+228.093511652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.657450 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.657791 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.157772418 +0000 UTC m=+228.277404651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.759363 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.759626 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.259614989 +0000 UTC m=+228.379247222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.860895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.861072 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.361045206 +0000 UTC m=+228.480677439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.861397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.861668 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.361659332 +0000 UTC m=+228.481291565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.915944 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.915998 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.962166 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.962337 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.462319411 +0000 UTC m=+228.581951644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.962423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.963210 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.463192763 +0000 UTC m=+228.582824996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.063039 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.063233 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.563203206 +0000 UTC m=+228.682835439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.063331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.063618 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.563604616 +0000 UTC m=+228.683236849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.164015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.164266 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.664231384 +0000 UTC m=+228.783863617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.164400 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.164885 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.664871281 +0000 UTC m=+228.784503514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.265302 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.265493 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.765463588 +0000 UTC m=+228.885095821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.265621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.266056 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.766040363 +0000 UTC m=+228.885672596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.366111 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.366398 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.866380734 +0000 UTC m=+228.986012967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.478461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.479276 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.97925405 +0000 UTC m=+229.098886283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.500308 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.500361 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.579795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.580361 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.08034195 +0000 UTC m=+229.199974193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.656309 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.656410 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.657644 4713 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-l464l container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.657684 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" podUID="c61cbc0b-441e-4704-accf-35963b3758aa" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.681509 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.682757 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.182740343 +0000 UTC m=+229.302372576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.782900 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.783014 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.282996472 +0000 UTC m=+229.402628705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.783398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.783760 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.283749341 +0000 UTC m=+229.403381574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.885178 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.885913 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.385892647 +0000 UTC m=+229.505524880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.932958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"cb11a6658b39cb703d8113bf5a062563b52b88c1bbd96ee7254651b3846fcc57"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.941159 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"2a68097e188634237fb4d5e58d360c20797f8f0410061c29d2759430b638f631"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.943387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" event={"ID":"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0","Type":"ContainerStarted","Data":"03c209db335e58ea5662b7255481b43b8d7ba579b7f2816ef681de60076745f6"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.946421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" event={"ID":"2be1cb07-55b6-4220-989e-13415c3156b2","Type":"ContainerStarted","Data":"b9ed1e36977e077482671111c31c7d2ed9d272672f4b5cc953db2d76ad581370"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.948569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerStarted","Data":"fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.950119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"a517b5241ccbf241e1f4fe7609545a13698dd49b10242725eaeb8822a82084d8"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.957127 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" event={"ID":"ee63f184-4609-43d4-bdc1-2c840aef6d7f","Type":"ContainerStarted","Data":"a8b4209283dacd63ee8a200d4e5a6a96337e44c09b55c6b835f3ad418c0ad093"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960141 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960243 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960293 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8m94r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960332 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podUID="0d2f415a-2626-45f9-baf0-68ab25b9d079" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.961397 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bn56j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.961445 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podUID="5eb834dd-5358-45c4-bbca-50baf0e8656b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.977870 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" podStartSLOduration=156.977845528 podStartE2EDuration="2m36.977845528s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:34.973440387 +0000 UTC m=+229.093072620" watchObservedRunningTime="2026-03-08 00:09:34.977845528 +0000 UTC m=+229.097477771" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.002451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.002752 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.502736323 +0000 UTC m=+229.622368556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.150567 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.150970 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.650952598 +0000 UTC m=+229.770584831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.306208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.306589 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.806569538 +0000 UTC m=+229.926201841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.313593 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" podStartSLOduration=157.313576854 podStartE2EDuration="2m37.313576854s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:34.996862336 +0000 UTC m=+229.116494569" watchObservedRunningTime="2026-03-08 00:09:35.313576854 +0000 UTC m=+229.433209087" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.314713 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" podStartSLOduration=157.314705592 podStartE2EDuration="2m37.314705592s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.312513967 +0000 UTC m=+229.432146200" watchObservedRunningTime="2026-03-08 00:09:35.314705592 +0000 UTC m=+229.434337825" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.320799 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.338279 4713 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podStartSLOduration=157.338256914 podStartE2EDuration="2m37.338256914s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.337731861 +0000 UTC m=+229.457364094" watchObservedRunningTime="2026-03-08 00:09:35.338256914 +0000 UTC m=+229.457889157" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.356102 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podStartSLOduration=157.356083472 podStartE2EDuration="2m37.356083472s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.355022755 +0000 UTC m=+229.474654988" watchObservedRunningTime="2026-03-08 00:09:35.356083472 +0000 UTC m=+229.475715705" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.376952 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sxbdk" podStartSLOduration=7.376931126 podStartE2EDuration="7.376931126s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.374903075 +0000 UTC m=+229.494535318" watchObservedRunningTime="2026-03-08 00:09:35.376931126 +0000 UTC m=+229.496563389" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.397874 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-drs4q" podStartSLOduration=157.397851841 podStartE2EDuration="2m37.397851841s" podCreationTimestamp="2026-03-08 00:06:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.394139298 +0000 UTC m=+229.513771531" watchObservedRunningTime="2026-03-08 00:09:35.397851841 +0000 UTC m=+229.517484074" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.407008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.408514 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.908491429 +0000 UTC m=+230.028123662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.420813 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-58c66" podStartSLOduration=158.420797378 podStartE2EDuration="2m38.420797378s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.420154222 +0000 UTC m=+229.539786455" watchObservedRunningTime="2026-03-08 00:09:35.420797378 +0000 UTC m=+229.540429611" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.508649 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.508989 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.008971613 +0000 UTC m=+230.128603846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.609912 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.610288 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.110271639 +0000 UTC m=+230.229903872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.711808 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.712190 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.212172439 +0000 UTC m=+230.331804672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.813367 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.813551 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.313503395 +0000 UTC m=+230.433135628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.813610 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.813989 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.313972687 +0000 UTC m=+230.433604910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887297 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887608 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887649 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.915033 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.915312 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.415297693 +0000 UTC m=+230.534929926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.964247 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"b1bd8cefe222cc7b85756393bbccec0bebade9d8bd0e8902a6b8e0a194d2fc57"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.965691 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmjhj" event={"ID":"158ba4b3-9da3-4a83-95dd-e625c7b19a2b","Type":"ContainerStarted","Data":"ddfcb2d55f56fcd69cf955f63872c49317a99abe32c31680854a4c6388206952"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.967103 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerStarted","Data":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.967305 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968502 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" 
event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerStarted","Data":"c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968804 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8gbn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968850 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.970169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"7c7edf766cc4bfbce05c51380f357c719f0be9f041874a17dca5fed8d540a66e"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.972024 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"875521de81715b88c169372fab2ed2cb0adebaeaadaacf944e8db61b0f28cd19"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.973289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" event={"ID":"0dbf7b38-8980-49e5-956c-08e443912846","Type":"ContainerStarted","Data":"fa32a54cb695b8a35913b6b0e2a5406f92837e60651424b5ca87b3e7dc75adff"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.975331 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" event={"ID":"899ec382-6c79-460e-9e3c-9dfb25867855","Type":"ContainerStarted","Data":"c08b2fb485dc1ec5c4dcc92d157f7f830eab40b020da577c868ec0e26f18d3e1"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.977018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" event={"ID":"0e43994e-0aa1-4541-bce9-502bbc1dc0a0","Type":"ContainerStarted","Data":"d51e6fd41ac9c40899b923adfbe32076a9b6cc968bf920ed04170b7bfe90da00"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.981288 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xmjhj" podStartSLOduration=7.9812748110000005 podStartE2EDuration="7.981274811s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.979450645 +0000 UTC m=+230.099082888" watchObservedRunningTime="2026-03-08 00:09:35.981274811 +0000 UTC m=+230.100907044" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.984131 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"fff4729dff8af17b584d22a6436f22684389579715022ba86586f8cca9f4618d"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.986715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"3e011924e2fe3315854a0f3623269b9572c982571676ed5a2133605ddc8f6b2e"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.989032 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" event={"ID":"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc","Type":"ContainerStarted","Data":"e379738a4bef0a60ed14f3cf8d8a3c30d4a82ab1f64b9b1d40ccc937816c8a85"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.990374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"75d05f92fb5abe52844bbae56dec71015f076443bb20f74c35d27309150cfd58"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.991862 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" event={"ID":"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6","Type":"ContainerStarted","Data":"1c7678d5dbfcf2643ccdb86b5564eb19a218ea616ec11db244e26dbed403cb0b"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.992082 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993501 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g99pk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993540 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podUID="3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993792 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"2f71a72df5cc338370ced373637ec8de9d7b684f577930629be009740cd59848"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.995638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"65470464808bdda97e1a5591cd4693db924ffd2ec404d34cc73a8e884cacae00"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997477 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8m94r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997510 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podUID="0d2f415a-2626-45f9-baf0-68ab25b9d079" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997587 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bn56j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997601 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podUID="5eb834dd-5358-45c4-bbca-50baf0e8656b" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.006944 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podStartSLOduration=159.006923755 podStartE2EDuration="2m39.006923755s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.004503595 +0000 UTC m=+230.124135828" watchObservedRunningTime="2026-03-08 00:09:36.006923755 +0000 UTC m=+230.126555988" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.016503 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.017004 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.516976788 +0000 UTC m=+230.636609011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.050717 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" podStartSLOduration=158.050677535 podStartE2EDuration="2m38.050677535s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.025295857 +0000 UTC m=+230.144928090" watchObservedRunningTime="2026-03-08 00:09:36.050677535 +0000 UTC m=+230.170309768" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.077122 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" podStartSLOduration=159.077107609 podStartE2EDuration="2m39.077107609s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.076202186 +0000 UTC m=+230.195834419" watchObservedRunningTime="2026-03-08 00:09:36.077107609 +0000 UTC m=+230.196739832" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.086109 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" podStartSLOduration=158.086035233 podStartE2EDuration="2m38.086035233s" podCreationTimestamp="2026-03-08 
00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.053942437 +0000 UTC m=+230.173574670" watchObservedRunningTime="2026-03-08 00:09:36.086035233 +0000 UTC m=+230.205667466" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.106130 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podStartSLOduration=158.106106008 podStartE2EDuration="2m38.106106008s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.105342518 +0000 UTC m=+230.224974751" watchObservedRunningTime="2026-03-08 00:09:36.106106008 +0000 UTC m=+230.225738261" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.117258 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.120525 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.620505759 +0000 UTC m=+230.740137992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.124545 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" podStartSLOduration=158.12452203 podStartE2EDuration="2m38.12452203s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.122174021 +0000 UTC m=+230.241806264" watchObservedRunningTime="2026-03-08 00:09:36.12452203 +0000 UTC m=+230.244154263" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.139025 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podStartSLOduration=158.139004024 podStartE2EDuration="2m38.139004024s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.138360018 +0000 UTC m=+230.257992261" watchObservedRunningTime="2026-03-08 00:09:36.139004024 +0000 UTC m=+230.258636257" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.161771 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" podStartSLOduration=158.161752426 podStartE2EDuration="2m38.161752426s" podCreationTimestamp="2026-03-08 
00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.161561281 +0000 UTC m=+230.281193514" watchObservedRunningTime="2026-03-08 00:09:36.161752426 +0000 UTC m=+230.281384669" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.178929 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" podStartSLOduration=158.178909887 podStartE2EDuration="2m38.178909887s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.177797279 +0000 UTC m=+230.297429542" watchObservedRunningTime="2026-03-08 00:09:36.178909887 +0000 UTC m=+230.298542120" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.192985 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" podStartSLOduration=158.19296447 podStartE2EDuration="2m38.19296447s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.192733974 +0000 UTC m=+230.312366207" watchObservedRunningTime="2026-03-08 00:09:36.19296447 +0000 UTC m=+230.312596703" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.214028 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podStartSLOduration=158.214008319 podStartE2EDuration="2m38.214008319s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.212091341 
+0000 UTC m=+230.331723584" watchObservedRunningTime="2026-03-08 00:09:36.214008319 +0000 UTC m=+230.333640552" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.224083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.224461 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.724448241 +0000 UTC m=+230.844080464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.237434 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" podStartSLOduration=158.237410357 podStartE2EDuration="2m38.237410357s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.235722814 +0000 UTC m=+230.355355067" watchObservedRunningTime="2026-03-08 00:09:36.237410357 +0000 UTC m=+230.357042590" Mar 08 00:09:36 crc 
kubenswrapper[4713]: I0308 00:09:36.320334 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.325127 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.325252 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.825232994 +0000 UTC m=+230.944865237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.325491 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.325933 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.825915901 +0000 UTC m=+230.945548134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.427200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.427588 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.927569705 +0000 UTC m=+231.047201938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.528503 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.529045 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.029027624 +0000 UTC m=+231.148659927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.630199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.630540 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.130523605 +0000 UTC m=+231.250155838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.731322 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.733092 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.233071561 +0000 UTC m=+231.352703844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.833068 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.833413 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.333387652 +0000 UTC m=+231.453019885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.888511 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.888580 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.935064 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.935498 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.435483747 +0000 UTC m=+231.555115980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.010697 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"272378b14ead6b0fe8f70c4a69a4e8e415883406601525810e82923d770b8d6f"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.013253 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"ecacfcd803dd5b2c9f84eccc1b8c3ca6289b45f4bf70a11e20eed2588dfed870"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.013390 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.021692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"063872b729e3f32ff6b60124c486f029cdd9345a46169aea92b871431d458350"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.023503 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" 
event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"1544e4356d60f3367c60a34a2f6fa643b4ceb544c5db56eca24d5dd1b21d7db2"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.027249 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"a8dce851bd245a5dbc4a99d4117015c1cf2fed3bca5c996d3702ed4d45852654"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031184 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"b29a6383591408b33e46513d52dd44a7999f4a23ae697a854adb2bf157892504"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031813 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8gbn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031873 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.032163 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g99pk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 
00:09:37.032193 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podUID="3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.033344 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" podStartSLOduration=159.033331356 podStartE2EDuration="2m39.033331356s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.030867044 +0000 UTC m=+231.150499307" watchObservedRunningTime="2026-03-08 00:09:37.033331356 +0000 UTC m=+231.152963589" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.035697 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.036291 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.53627557 +0000 UTC m=+231.655907803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.047013 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" podStartSLOduration=159.046992919 podStartE2EDuration="2m39.046992919s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.045529562 +0000 UTC m=+231.165161795" watchObservedRunningTime="2026-03-08 00:09:37.046992919 +0000 UTC m=+231.166625152" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.097145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" podStartSLOduration=159.097125079 podStartE2EDuration="2m39.097125079s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.078585453 +0000 UTC m=+231.198217696" watchObservedRunningTime="2026-03-08 00:09:37.097125079 +0000 UTC m=+231.216757322" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.118488 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" podStartSLOduration=159.118472095 podStartE2EDuration="2m39.118472095s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.098300828 +0000 UTC m=+231.217933061" watchObservedRunningTime="2026-03-08 00:09:37.118472095 +0000 UTC m=+231.238104328" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.120606 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" podStartSLOduration=159.120596198 podStartE2EDuration="2m39.120596198s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.117809048 +0000 UTC m=+231.237441281" watchObservedRunningTime="2026-03-08 00:09:37.120596198 +0000 UTC m=+231.240228431" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.137748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.142747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.642730435 +0000 UTC m=+231.762362758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.152325 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" podStartSLOduration=160.152304895 podStartE2EDuration="2m40.152304895s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.137011201 +0000 UTC m=+231.256643454" watchObservedRunningTime="2026-03-08 00:09:37.152304895 +0000 UTC m=+231.271937128" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.155283 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" podStartSLOduration=159.15527509 podStartE2EDuration="2m39.15527509s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.151957017 +0000 UTC m=+231.271589260" watchObservedRunningTime="2026-03-08 00:09:37.15527509 +0000 UTC m=+231.274907333" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.169535 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" podStartSLOduration=159.169518798 podStartE2EDuration="2m39.169518798s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.167241471 +0000 UTC m=+231.286873724" watchObservedRunningTime="2026-03-08 00:09:37.169518798 +0000 UTC m=+231.289151031" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.239412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.239612 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.739579708 +0000 UTC m=+231.859211931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.239676 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.240023 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.740015369 +0000 UTC m=+231.859647602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.341051 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.341238 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.841220092 +0000 UTC m=+231.960852325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.341357 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.341651 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.841644093 +0000 UTC m=+231.961276316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.442486 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.442610 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.942592758 +0000 UTC m=+232.062224991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.442762 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.443053 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.94304166 +0000 UTC m=+232.062673883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.543626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.543762 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.04374479 +0000 UTC m=+232.163377023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.543891 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.544146 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.04413914 +0000 UTC m=+232.163771373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.644740 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.644943 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.144915252 +0000 UTC m=+232.264547485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.645200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.645535 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.145527387 +0000 UTC m=+232.265159620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.746032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.746200 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.246170456 +0000 UTC m=+232.365802689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.746454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.746772 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.246759581 +0000 UTC m=+232.366391814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.848087 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.848297 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.348264022 +0000 UTC m=+232.467896255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.848376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.848688 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.348680652 +0000 UTC m=+232.468312875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.888182 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.888273 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.949196 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.949354 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.449320821 +0000 UTC m=+232.568953054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.949538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.949842 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.449816593 +0000 UTC m=+232.569448886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043099 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043254 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043505 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.051785 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.051978 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.55195685 +0000 UTC m=+232.671589083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.052472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.054788 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.55477065 +0000 UTC m=+232.674402883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.088275 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lwhnh" podStartSLOduration=11.088254732 podStartE2EDuration="11.088254732s" podCreationTimestamp="2026-03-08 00:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:38.068468035 +0000 UTC m=+232.188100288" watchObservedRunningTime="2026-03-08 00:09:38.088254732 +0000 UTC m=+232.207886975" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.089904 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" podStartSLOduration=160.089895033 podStartE2EDuration="2m40.089895033s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:38.086491947 +0000 UTC m=+232.206124190" watchObservedRunningTime="2026-03-08 00:09:38.089895033 +0000 UTC m=+232.209527266" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.153540 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.153806 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.653775588 +0000 UTC m=+232.773407821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.154087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.155079 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.65506717 +0000 UTC m=+232.774699403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.255228 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.255393 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.755367221 +0000 UTC m=+232.874999454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.255482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.255919 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.755907394 +0000 UTC m=+232.875539627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.356885 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.357077 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.857049466 +0000 UTC m=+232.976681699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.357150 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.357482 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.857472006 +0000 UTC m=+232.977104309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.457849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.458183 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.958166126 +0000 UTC m=+233.077798349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.559101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.559459 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.059445611 +0000 UTC m=+233.179077844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.660145 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.660546 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.160528141 +0000 UTC m=+233.280160374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.761691 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.762045 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.262029141 +0000 UTC m=+233.381661374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.862879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.863001 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.362983668 +0000 UTC m=+233.482615901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.863080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.863377 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.363369148 +0000 UTC m=+233.483001381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.888701 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:09:38 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Mar 08 00:09:38 crc kubenswrapper[4713]: [+]process-running ok
Mar 08 00:09:38 crc kubenswrapper[4713]: healthz check failed
Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.888774 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.963718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.964097 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.464081898 +0000 UTC m=+233.583714131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.065042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.065368 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.565357363 +0000 UTC m=+233.684989596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.166698 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.167423 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.667402857 +0000 UTC m=+233.787035090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.268894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.269308 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.769289877 +0000 UTC m=+233.888922110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321192 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321192 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321242 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321305 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.371705 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.372130 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.872111311 +0000 UTC m=+233.991743544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.473302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.473677 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.973662433 +0000 UTC m=+234.093294666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.574633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.574801 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.074783214 +0000 UTC m=+234.194415447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.574914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.575187 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.075180844 +0000 UTC m=+234.194813077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.671136 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.675440 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.675554 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.175530495 +0000 UTC m=+234.295162728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.676125 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.17611752 +0000 UTC m=+234.295749753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.675895 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.682095 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.767868 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.768609 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-58c66"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.777280 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.777505 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.277472906 +0000 UTC m=+234.397105139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.777836 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.778335 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.278313018 +0000 UTC m=+234.397945251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.830916 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.879412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.880404 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.379591062 +0000 UTC m=+234.499223295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.880523 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.880930 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.380919636 +0000 UTC m=+234.500551869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.898620 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:09:39 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Mar 08 00:09:39 crc kubenswrapper[4713]: [+]process-running ok
Mar 08 00:09:39 crc kubenswrapper[4713]: healthz check failed
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.898670 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.957520 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"
Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.981752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.982868 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.482835867 +0000 UTC m=+234.602468100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.071331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"7470769f57edb813356a2be9d5379cbabe535bd2a0b0a02d545f7198d60d26db"}
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.087190 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51858: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.087630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.088729 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.588712707 +0000 UTC m=+234.708344940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.189168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.189387 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.689356356 +0000 UTC m=+234.808988589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.189534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.189956 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.689944621 +0000 UTC m=+234.809576854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.220017 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51870: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.291045 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.291352 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.791336558 +0000 UTC m=+234.910968791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.301357 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.302178 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.302211 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303068 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303570 4713 patch_prober.go:28] interesting pod/console-f9d7485db-gk97q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303635 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gk97q" podUID="1d068555-56f2-4bcf-8b4c-cc574ad087fa" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 08 00:09:40 crc kubenswrapper[4713]: W0308 00:09:40.307009 4713 reflector.go:561] object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n": failed to list *v1.Secret: secrets "installer-sa-dockercfg-5pr6n" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-kube-apiserver": no relationship found between node 'crc' and this object
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.307056 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-5pr6n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"installer-sa-dockercfg-5pr6n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 08 00:09:40 crc kubenswrapper[4713]: W0308 00:09:40.307118 4713 reflector.go:561] object-"openshift-kube-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-kube-apiserver": no relationship found between node 'crc' and this object
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.307134 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.318200 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.341993 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342042 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342254 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342268 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.367024 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51876: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392204 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392397 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.393523 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.893511356 +0000 UTC m=+235.013143589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.397885 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.455629 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51890: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.493200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.494074 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.993371055 +0000 UTC m=+235.113003288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494288 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.494910 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2026-03-08 00:09:40.994902243 +0000 UTC m=+235.114534476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.495047 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.527188 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.528377 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.532061 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.572139 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.572476 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51906: no serving certificate available for the kubelet" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597386 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597684 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.597794 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.097777478 +0000 UTC m=+235.217409711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.628134 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.653033 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 
00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.699561 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.199547125 +0000 UTC m=+235.319179358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.701639 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.702313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.723843 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.725063 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.730230 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.735661 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51908: no serving certificate available for the kubelet" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.765014 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.784858 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804454 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804694 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.804884 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.304869082 +0000 UTC m=+235.424501315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.843925 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.886326 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.889153 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:40 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:40 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:40 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.889193 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905650 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905691 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.905970 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.405958742 +0000 UTC m=+235.525590975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.906316 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.906526 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.915114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.915970 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.933008 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.940703 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.946367 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.954527 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51912: no serving certificate available for the kubelet" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.964494 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.994900 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.008636 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.008951 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.009022 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.009088 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.009186 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.509171334 +0000 UTC m=+235.628803567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.040856 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.059073 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.073154 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.101078 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.101281 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.115934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.116024 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.116108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.119382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.119907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.124513 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.124775 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.624762279 +0000 UTC m=+235.744394512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.125637 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.138468 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.150203 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.174929 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.186413 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221414 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221776 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221848 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221881 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.222097 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.722081754 +0000 UTC m=+235.841713987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.238189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324280 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324327 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325008 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.325553 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.825543194 +0000 UTC m=+235.945175427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325678 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.350767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.363816 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51926: no serving certificate available for the kubelet" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.372129 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.405211 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.416394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.427625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.428043 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.928027899 +0000 UTC m=+236.047660132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.443715 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.461500 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.470016 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: W0308 00:09:41.471264 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9341928_7a63_4190_ac37_ac9ba3320e18.slice/crio-8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81 WatchSource:0}: Error finding container 8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81: Status 404 returned error can't find the container with id 8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81 Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.484111 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.501528 4713 patch_prober.go:28] interesting pod/apiserver-76f77b778f-58c66 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]log ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]etcd ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:09:41 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:09:41 crc kubenswrapper[4713]: livez check failed Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.501589 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-58c66" podUID="bfa92863-23f8-42d4-8e73-433bf546d304" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.529165 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.529556 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.029540119 +0000 UTC m=+236.149172352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.631334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.631735 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.131712987 +0000 UTC m=+236.251345220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.694184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.696967 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.733642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.734223 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.234212172 +0000 UTC m=+236.353844405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.838638 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.839001 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.338986235 +0000 UTC m=+236.458618468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.912076 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.940684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.941166 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.441138672 +0000 UTC m=+236.560770905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.971988 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:41 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.972037 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.045168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.045443 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:42.545416382 +0000 UTC m=+236.665048605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.045626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.046717 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.546705784 +0000 UTC m=+236.666338017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.066360 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51932: no serving certificate available for the kubelet" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.146865 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.147366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.147636 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.64762139 +0000 UTC m=+236.767253623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.167424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"3cdea3678803ad7453d0a386b7a4a0468a866e4a3767422ad83b05a97ef4bf14"} Mar 08 00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.179254 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4a956b_6edb_436e_bd5e_5d57899c2ea1.slice/crio-135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9 WatchSource:0}: Error finding container 135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9: Status 404 returned error can't find the container with id 135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db"} Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179705 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db" exitCode=0 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179900 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81"} Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.197742 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.216125 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.237460 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.237720 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" containerID="cri-o://9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" gracePeriod=30 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.245049 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.245320 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" containerID="cri-o://a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" gracePeriod=30 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.249383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.249669 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.749658024 +0000 UTC m=+236.869290257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.297420 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33b42a1_bf95_490f_a907_765855ec81d1.slice/crio-8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4 WatchSource:0}: Error finding container 8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4: Status 404 returned error can't find the container with id 8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.303635 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.325438 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 
00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.331088 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde40fceb_b995_45d6_8272_3a93c1b85bc8.slice/crio-43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4 WatchSource:0}: Error finding container 43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4: Status 404 returned error can't find the container with id 43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.350357 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.351092 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.851073612 +0000 UTC m=+236.970705845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.453448 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.453772 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.953752382 +0000 UTC m=+237.073384615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.518890 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.521386 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.524911 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.525479 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.555286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.555497 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.055477158 +0000 UTC m=+237.175109391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.555645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.556385 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.05636787 +0000 UTC m=+237.176000103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.656797 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.656952 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.156922957 +0000 UTC m=+237.276555190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657061 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657151 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657174 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.657467 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.15745562 +0000 UTC m=+237.277087853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.758631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.758916 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.759747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.25972849 +0000 UTC m=+237.379360723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.760436 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.777114 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.816958 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.862289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.862669 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.362652226 +0000 UTC m=+237.482284459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.892704 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:42 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:42 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:42 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.892746 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.933899 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.934938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.948603 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.963518 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.964566 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.464550866 +0000 UTC m=+237.584183099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.065606 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.065975 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.066032 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.066091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxjck\" (UniqueName: 
\"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.066358 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.566347394 +0000 UTC m=+237.685979627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.166673 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.166860 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.666814468 +0000 UTC m=+237.786446701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.166980 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167099 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167146 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: 
\"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167582 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.167637 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.667629268 +0000 UTC m=+237.787261491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.168029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.175908 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186751 4713 generic.go:334] "Generic (PLEG): container finished" podID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186888 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerDied","Data":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerDied","Data":"4dcd3efc63c2bb82108f5db86db8f7d5ce1c4ffb7c4a91ed149a6c9ab7e1050e"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.187160 4713 scope.go:117] "RemoveContainer" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.187288 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.191627 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.192491 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.198909 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerStarted","Data":"4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.198953 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerStarted","Data":"43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.210124 4713 generic.go:334] "Generic (PLEG): container finished" podID="2a04a017-1594-43d7-a796-8c676b28095e" containerID="c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.210201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerDied","Data":"c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.219263 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" 
event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"f9994e738e641d54be6f247f3a1e0358bcb1b2e919a54e81e49a4879ccbc6546"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.223562 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.223727 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.226815 4713 scope.go:117] "RemoveContainer" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.228125 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": container with ID starting with a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638 not found: ID does not exist" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.228155 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} err="failed to get container status \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": rpc error: code = NotFound desc = could not find container \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": container with ID starting with a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638 not found: ID does not exist" Mar 08 00:09:43 crc 
kubenswrapper[4713]: I0308 00:09:43.233284 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.233199706 podStartE2EDuration="3.233199706s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:43.232764575 +0000 UTC m=+237.352396808" watchObservedRunningTime="2026-03-08 00:09:43.233199706 +0000 UTC m=+237.352831939" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.244965 4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.245065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.245093 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.258733 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.262172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerStarted","Data":"ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.262241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerStarted","Data":"55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.265779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267711 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267741 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc 
kubenswrapper[4713]: I0308 00:09:43.267765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267809 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267854 4713 generic.go:334] "Generic (PLEG): container finished" podID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267909 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerDied","Data":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267926 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267940 4713 scope.go:117] "RemoveContainer" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268047 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268125 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268706 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.269161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca" (OuterVolumeSpecName: "client-ca") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.269604 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.76958413 +0000 UTC m=+237.889216433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267928 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerDied","Data":"a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.270176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config" (OuterVolumeSpecName: "config") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.271137 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.274938 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config" (OuterVolumeSpecName: "config") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276504 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276547 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268013 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282299 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282557 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282633 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5" (OuterVolumeSpecName: "kube-api-access-fzcz5") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "kube-api-access-fzcz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.287329 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc" (OuterVolumeSpecName: "kube-api-access-549nc") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "kube-api-access-549nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: W0308 00:09:43.291229 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822fdb72_7e7f_441b_8ebc_178ef46cca73.slice/crio-fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f WatchSource:0}: Error finding container fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f: Status 404 returned error can't find the container with id fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.306022 4713 scope.go:117] "RemoveContainer" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.307154 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": container with ID starting with 9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba not found: ID does not exist" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.307178 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} err="failed to get container status \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": rpc error: code = NotFound desc = could not find container \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": container with ID starting with 9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba not found: ID does not exist" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.318167 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.31812236 podStartE2EDuration="3.31812236s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:43.3181387 +0000 UTC m=+237.437770943" watchObservedRunningTime="2026-03-08 00:09:43.31812236 +0000 UTC m=+237.437754593" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.369650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.369863 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.370301 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.870066145 +0000 UTC m=+237.989698378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370453 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370488 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370499 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370512 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370521 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370529 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370539 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370549 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.381889 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56422: no serving certificate available for the kubelet" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.471469 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.471586 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.971566966 +0000 UTC m=+238.091199209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.472211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.472512 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.972502009 +0000 UTC m=+238.092134242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.517361 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.519922 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.573495 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.573652 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.07362908 +0000 UTC m=+238.193261313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.573776 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.574051 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.07403995 +0000 UTC m=+238.193672173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.604038 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.606435 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.674773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.675232 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.175214262 +0000 UTC m=+238.294846495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.694378 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:43 crc kubenswrapper[4713]: W0308 00:09:43.769567 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef0ec0c_d1f7_4ed1_81d8_fe12497c15b0.slice/crio-6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a WatchSource:0}: Error finding container 6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a: Status 404 returned error can't find the container with id 6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.775905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.776193 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.276180739 +0000 UTC m=+238.395812972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.877088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.877308 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.37727731 +0000 UTC m=+238.496909543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.877426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.877797 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.377784672 +0000 UTC m=+238.497416985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.889171 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:43 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:43 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:43 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.889504 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.909503 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.910343 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910363 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.910402 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910412 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910747 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910774 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.913450 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.915803 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.924866 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978587 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: 
\"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.978953 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.478905563 +0000 UTC m=+238.598537796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.979120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfdss\" (UniqueName: 
\"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080693 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080741 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.081492 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.082321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.082428 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.582411494 +0000 UTC m=+238.702043727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.100153 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.182406 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.182545 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.682528799 +0000 UTC m=+238.802161032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.182755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.183016 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.683007631 +0000 UTC m=+238.802639854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.237397 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.271714 4713 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.283607 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.283747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.783722342 +0000 UTC m=+238.903354575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.283896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.284168 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.784158053 +0000 UTC m=+238.903790276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.286710 4713 generic.go:334] "Generic (PLEG): container finished" podID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerID="4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.286764 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerDied","Data":"4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288143 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.290284 4713 
generic.go:334] "Generic (PLEG): container finished" podID="64aa73b3-797b-405e-b2ca-db772f204659" containerID="ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.290365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerDied","Data":"ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293578 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293656 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerStarted","Data":"fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.297404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"7f7d7a7a5f5312cb47aeedd31881890eb92d61d686058c3f78862dbedd1bf7b0"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.321583 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.333367 4713 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.333814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.335593 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.337165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.339412 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340126 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340256 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.341202 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.343537 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347096 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347110 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347111 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347323 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347489 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347506 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347617 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347726 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.351246 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:09:44 crc 
kubenswrapper[4713]: I0308 00:09:44.353412 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.356594 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.385566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.387179 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.887157591 +0000 UTC m=+239.006789834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487550 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487617 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod 
\"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487669 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487791 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487840 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487865 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487889 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.488036 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.488133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod 
\"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.488241 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.988226481 +0000 UTC m=+239.107858784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.548318 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" path="/var/lib/kubelet/pods/c5cc5125-93f0-4709-afbd-7aa6a888b641/volumes" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.549068 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" path="/var/lib/kubelet/pods/e4ba1fb6-83e1-4a29-93a5-5abf00f86718/volumes" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.589536 4713 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:45.089498564 +0000 UTC m=+239.209130797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589604 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589686 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod 
\"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589863 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589915 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.590355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.590899 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:45.090885849 +0000 UTC m=+239.210518162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591264 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591809 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591809 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.592211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.592145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.593032 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.605067 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.607304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.607872 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.608992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.609475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 
00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.665068 4713 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-08T00:09:44.271738831Z","Handler":null,"Name":""} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.667443 4713 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.667478 4713 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.690730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.691597 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.700207 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.711395 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.719445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.775725 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.781881 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.792552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.828292 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.828433 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.898629 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:44 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:44 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:44 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.898704 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.900986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.064886 4713 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.307745 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"e8a9049253a3fc1792b0ad8eaa854121335515ac080505e4b1d64d009bd0e53e"} Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.332213 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" podStartSLOduration=17.332193306 podStartE2EDuration="17.332193306s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:45.329726844 +0000 UTC m=+239.449359107" watchObservedRunningTime="2026-03-08 00:09:45.332193306 +0000 UTC m=+239.451825549" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.888505 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:45 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:45 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:45 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.888589 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.965354 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56434: no serving certificate available for 
the kubelet" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.119677 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.553976 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.890553 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:46 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:46 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:46 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.890606 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:47 crc kubenswrapper[4713]: I0308 00:09:47.888192 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:47 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:47 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:47 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:47 crc kubenswrapper[4713]: I0308 00:09:47.888497 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" 
podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.155001 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56440: no serving certificate available for the kubelet" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.874294 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.882138 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.889020 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:48 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:48 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:48 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.889060 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.901198 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"de40fceb-b995-45d6-8272-3a93c1b85bc8\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969464 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969610 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"de40fceb-b995-45d6-8272-3a93c1b85bc8\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de40fceb-b995-45d6-8272-3a93c1b85bc8" (UID: "de40fceb-b995-45d6-8272-3a93c1b85bc8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.970420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.975523 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.975860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j" (OuterVolumeSpecName: "kube-api-access-5l55j") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "kube-api-access-5l55j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.976059 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de40fceb-b995-45d6-8272-3a93c1b85bc8" (UID: "de40fceb-b995-45d6-8272-3a93c1b85bc8"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.070487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"64aa73b3-797b-405e-b2ca-db772f204659\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"64aa73b3-797b-405e-b2ca-db772f204659\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071393 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071433 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071445 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071455 4713 reconciler_common.go:293] "Volume detached for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.070617 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "64aa73b3-797b-405e-b2ca-db772f204659" (UID: "64aa73b3-797b-405e-b2ca-db772f204659"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.074171 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "64aa73b3-797b-405e-b2ca-db772f204659" (UID: "64aa73b3-797b-405e-b2ca-db772f204659"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.172710 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.172749 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337001 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337022 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerDied","Data":"f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337352 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.341760 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.341763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerDied","Data":"43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.342014 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerDied","Data":"55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343506 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343509 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.888782 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:49 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:49 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:49 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.888871 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.300977 4713 patch_prober.go:28] interesting pod/console-f9d7485db-gk97q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.301396 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gk97q" podUID="1d068555-56f2-4bcf-8b4c-cc574ad087fa" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341356 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341408 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341431 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341470 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.889478 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:50 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:50 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:50 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.889544 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.105077 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56454: no serving certificate available for the kubelet" Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.889064 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:51 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:51 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:51 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.889359 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:52 crc kubenswrapper[4713]: I0308 00:09:52.888650 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:52 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:52 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:52 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:52 crc kubenswrapper[4713]: I0308 00:09:52.888706 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:54 crc kubenswrapper[4713]: I0308 00:09:54.010408 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:54 crc kubenswrapper[4713]: I0308 00:09:54.013773 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.352855 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.354322 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.369404 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.457748 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.466662 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.653272 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.653680 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:09:58 crc kubenswrapper[4713]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 08 00:09:58 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrkff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29548808-nd57l_openshift-infra(fdccd72c-79d7-4388-926e-0539c571dafe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 08 00:09:58 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.654881 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29548808-nd57l" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" Mar 08 00:09:59 crc kubenswrapper[4713]: I0308 00:09:59.162450 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:59 crc kubenswrapper[4713]: I0308 00:09:59.206329 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:59 crc kubenswrapper[4713]: E0308 00:09:59.394231 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29548808-nd57l" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125457 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125701 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125717 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125731 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125737 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125752 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125899 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125916 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125927 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.126384 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.130723 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.136332 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.220319 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.321379 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.337433 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.338897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340811 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340847 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340867 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340873 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340906 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340921 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341274 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"} pod="openshift-console/downloads-7954f5f757-z4s84" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341314 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" containerID="cri-o://0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4" gracePeriod=2 Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341463 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341501 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.449180 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:01 crc kubenswrapper[4713]: I0308 00:10:01.397239 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:01 crc kubenswrapper[4713]: I0308 00:10:01.412194 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:02 crc kubenswrapper[4713]: I0308 00:10:02.423288 4713 generic.go:334] "Generic (PLEG): container finished" podID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerID="0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4" exitCode=0 Mar 08 00:10:02 crc kubenswrapper[4713]: I0308 00:10:02.423335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerDied","Data":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"} Mar 08 00:10:04 crc kubenswrapper[4713]: I0308 00:10:04.501323 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:10:04 crc kubenswrapper[4713]: I0308 00:10:04.501405 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:10:07 crc kubenswrapper[4713]: I0308 00:10:07.963758 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.342934 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.343288 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.968213 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.475723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerStarted","Data":"f276e2b1a7d3ec5d946c0b825a48087cfddd233e9465ddce823aae24d96aed33"} Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.478579 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"7c30588800e0dac5ab38807a23f6184382c53099e569400f6073fb7739048d46"} Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.490024 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.604735 4713 ???:1] "http: TLS handshake error from 192.168.126.11:55082: no serving certificate available for the kubelet" Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.730588 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.771124 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:10:14 crc kubenswrapper[4713]: W0308 00:10:14.030162 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcde95f7_8814_4319_8a48_6d186de5f51f.slice/crio-ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e WatchSource:0}: Error finding container ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e: Status 404 returned error can't find the container with id ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e Mar 08 00:10:14 crc kubenswrapper[4713]: W0308 00:10:14.037278 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a8aac8_a3d8_45c3_a4f2_6420f4740ac9.slice/crio-bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d WatchSource:0}: Error finding container bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d: Status 404 returned error can't find the container with id bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.497169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" 
event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e"} Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.498927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerStarted","Data":"bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d"} Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.500080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerStarted","Data":"409ade3b4669dbf5f8873e64f32cc4c3239e1b04d6422acbe8d91847c500cbde"} Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.887666 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.888360 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.888464 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.890868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.891481 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.930671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.931007 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.031665 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.031704 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.032058 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.067718 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.454584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:20 crc kubenswrapper[4713]: I0308 00:10:20.342264 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:20 crc kubenswrapper[4713]: I0308 00:10:20.342665 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.468319 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.469147 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.475714 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498578 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.599955 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600023 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.617142 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.788273 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.507037 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.507712 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prrdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x6gcb_openshift-marketplace(d9341928-7a63-4190-ac37-ac9ba3320e18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.508967 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" Mar 08 00:10:26 crc kubenswrapper[4713]: I0308 00:10:26.614782 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:26 crc kubenswrapper[4713]: I0308 00:10:26.641405 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.163577 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.163855 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8fx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4tj99_openshift-marketplace(40864d72-e137-478e-8340-8c0f107b4c60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.165043 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" Mar 08 00:10:29 crc 
kubenswrapper[4713]: E0308 00:10:29.638998 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.639466 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7bjqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-x7pkf_openshift-marketplace(c33b42a1-bf95-490f-a907-765855ec81d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.640676 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.759029 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.759181 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t4bc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pd9br_openshift-marketplace(cd4a956b-6edb-436e-bd5e-5d57899c2ea1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.760853 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" Mar 08 00:10:29 crc 
kubenswrapper[4713]: E0308 00:10:29.915048 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.915079 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" Mar 08 00:10:29 crc kubenswrapper[4713]: W0308 00:10:29.935097 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6470285d_4460_4c72_be17_00e880cc623d.slice/crio-4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2 WatchSource:0}: Error finding container 4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2: Status 404 returned error can't find the container with id 4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2 Mar 08 00:10:29 crc kubenswrapper[4713]: W0308 00:10:29.936996 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02de296b_0485_4f21_abf9_51043545b565.slice/crio-a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf WatchSource:0}: Error finding container a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf: Status 404 returned error can't find the container with id a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.088324 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc 
= copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.088864 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxjck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hs88q_openshift-marketplace(2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.090098 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.341336 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.341405 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.575193 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerStarted","Data":"4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2"} Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.576793 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf"} Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.805567 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 
00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.850631 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.973919 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.974165 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.974199 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" Mar 08 00:10:30 crc kubenswrapper[4713]: W0308 00:10:30.992645 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d4ec730_3a6b_4bb3_8878_a3f458fed7a2.slice/crio-e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7 WatchSource:0}: Error finding container e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7: Status 404 returned error can't find the container with id e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7 Mar 08 00:10:30 crc kubenswrapper[4713]: W0308 00:10:30.997403 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-poddc51fa12_ec6c_48ee_8fd5_55388414d54f.slice/crio-d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df WatchSource:0}: Error finding container d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df: Status 404 returned error can't find the container with id d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644243 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"6c825c4961943cf83a347e73d9455b846b95d6105e56a08a5541dea0e250734c"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644563 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644959 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.645015 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646276 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerStarted","Data":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646297 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" containerID="cri-o://2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" gracePeriod=30 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646406 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerStarted","Data":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649174 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" containerID="cri-o://0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" gracePeriod=30 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649242 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.652994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerStarted","Data":"e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.654622 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" 
containerID="99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208" exitCode=0 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.654691 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.655644 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerStarted","Data":"d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.658440 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb" exitCode=0 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.658527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.661462 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerStarted","Data":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.685931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.701716 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" podStartSLOduration=49.701696821 podStartE2EDuration="49.701696821s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:31.697966483 +0000 UTC m=+285.817598716" watchObservedRunningTime="2026-03-08 00:10:31.701696821 +0000 UTC m=+285.821329064" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.729764 4713 patch_prober.go:28] interesting pod/route-controller-manager-857fc9cd49-86dkp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:32974->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.729806 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:32974->10.217.0.57:8443: read: connection reset by peer" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.730043 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podStartSLOduration=49.730034949 podStartE2EDuration="49.730034949s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:31.714299884 +0000 UTC m=+285.833932117" watchObservedRunningTime="2026-03-08 00:10:31.730034949 +0000 UTC m=+285.849667182" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.153008 4713 csr.go:261] certificate 
signing request csr-8g47m is approved, waiting to be issued Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.160490 4713 csr.go:257] certificate signing request csr-8g47m is issued Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.565265 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-857fc9cd49-86dkp_74518133-92a1-4cb0-bcb9-85ce78bb2c1f/route-controller-manager/0.log" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.565868 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.570682 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.602860 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.609526 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.609632 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.609651 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.609664 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.610518 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.610539 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.611709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.639343 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658688 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658779 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658812 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.660309 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca" (OuterVolumeSpecName: "client-ca") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.660427 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config" (OuterVolumeSpecName: "config") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.664879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b" (OuterVolumeSpecName: "kube-api-access-5jg8b") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "kube-api-access-5jg8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.664971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.666700 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerStarted","Data":"b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.668574 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.668633 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669873 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-857fc9cd49-86dkp_74518133-92a1-4cb0-bcb9-85ce78bb2c1f/route-controller-manager/0.log" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669901 4713 generic.go:334] "Generic (PLEG): container finished" podID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" exitCode=255 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerDied","Data":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670051 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerDied","Data":"409ade3b4669dbf5f8873e64f32cc4c3239e1b04d6422acbe8d91847c500cbde"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670065 4713 scope.go:117] "RemoveContainer" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.673450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"8d66e38ca3acbd10e7fd1bbbfa3f7735eac5a6a0db2471c93d80fc8e73e19ae2"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.673512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"6174fac062b15063d6f4a7cb7e5e9cc9fcde6c4007b95d3fe1884f1c0485c85d"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.675871 4713 generic.go:334] "Generic (PLEG): container finished" podID="fdccd72c-79d7-4388-926e-0539c571dafe" containerID="11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.676000 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerDied","Data":"11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677800 4713 generic.go:334] "Generic (PLEG): container finished" podID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677892 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerDied","Data":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.679637 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerDied","Data":"f276e2b1a7d3ec5d946c0b825a48087cfddd233e9465ddce823aae24d96aed33"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.685797 4713 generic.go:334] "Generic (PLEG): container finished" podID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerID="3d57ce672ca7a4417b25b823232a1b0087d96c80347a2c4c027d8db9eed30aa7" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686311 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerDied","Data":"3d57ce672ca7a4417b25b823232a1b0087d96c80347a2c4c027d8db9eed30aa7"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686763 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686778 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686807 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.688595 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=11.688581918 podStartE2EDuration="11.688581918s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:32.685050665 +0000 UTC m=+286.804682898" watchObservedRunningTime="2026-03-08 00:10:32.688581918 +0000 UTC m=+286.808214161" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.705913 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.708190 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.754026 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" podStartSLOduration=214.754008405 podStartE2EDuration="3m34.754008405s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:32.750557014 +0000 UTC m=+286.870189277" watchObservedRunningTime="2026-03-08 00:10:32.754008405 +0000 UTC m=+286.873640638" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760023 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760087 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760220 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760279 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760505 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761210 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761272 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761386 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761620 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761650 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761664 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca" (OuterVolumeSpecName: "client-ca") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762619 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config" (OuterVolumeSpecName: "config") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.763030 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j" (OuterVolumeSpecName: "kube-api-access-g5w5j") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "kube-api-access-g5w5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.763061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862585 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862880 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863021 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5w5j\" (UniqueName: 
\"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863031 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863039 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863048 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863056 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.864069 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.864372 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.866240 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.884562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.886793 4713 scope.go:117] "RemoveContainer" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.889088 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": container with ID starting with 2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e not found: ID does not exist" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.889135 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} err="failed to get container status \"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": rpc error: code = NotFound desc = could not find container 
\"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": container with ID starting with 2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e not found: ID does not exist" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.889167 4713 scope.go:117] "RemoveContainer" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.933964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.025288 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.025344 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.081677 4713 scope.go:117] "RemoveContainer" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:33 crc kubenswrapper[4713]: E0308 00:10:33.082225 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": container with ID starting with 0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338 not found: ID does not exist" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.082265 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} err="failed to get container status \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": rpc error: code = NotFound desc = could 
not find container \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": container with ID starting with 0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338 not found: ID does not exist" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.162993 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 13:16:59.715685512 +0000 UTC Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.163035 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6469h6m26.552653422s for next certificate rotation Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.919145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9klvz" podStartSLOduration=215.919122045 podStartE2EDuration="3m35.919122045s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:33.712159873 +0000 UTC m=+287.831792126" watchObservedRunningTime="2026-03-08 00:10:33.919122045 +0000 UTC m=+288.038754278" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.923492 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:33 crc kubenswrapper[4713]: W0308 00:10:33.942082 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7daca87e_5103_46bd_b6ae_7643c66a4fbc.slice/crio-ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7 WatchSource:0}: Error finding container ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7: Status 404 returned error can't find the container with id ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7 Mar 08 00:10:33 crc 
kubenswrapper[4713]: I0308 00:10:33.945889 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.973986 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111013 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"fdccd72c-79d7-4388-926e-0539c571dafe\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111323 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" (UID: "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111542 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.116368 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff" (OuterVolumeSpecName: "kube-api-access-hrkff") pod "fdccd72c-79d7-4388-926e-0539c571dafe" (UID: "fdccd72c-79d7-4388-926e-0539c571dafe"). InnerVolumeSpecName "kube-api-access-hrkff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.116417 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" (UID: "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.212850 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.213182 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502303 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502359 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502399 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502881 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502940 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" gracePeriod=600 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.570700 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" path="/var/lib/kubelet/pods/74518133-92a1-4cb0-bcb9-85ce78bb2c1f/volumes" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.571595 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" path="/var/lib/kubelet/pods/abef8d7b-3e23-43e9-96d4-3227bcc16048/volumes" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702607 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerDied","Data":"e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702646 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702722 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.706763 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" exitCode=0 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.706865 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.710089 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerStarted","Data":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712258 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerDied","Data":"0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712718 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.715026 4713 generic.go:334] "Generic (PLEG): container finished" podID="6470285d-4460-4c72-be17-00e880cc623d" containerID="1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1" exitCode=0 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.715083 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerDied","Data":"1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718230 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerStarted","Data":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718284 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerStarted","Data":"ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718495 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.731697 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.739309 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5hssk" podStartSLOduration=7.277833377 podStartE2EDuration="52.739289061s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="2026-03-08 00:09:48.810052202 +0000 UTC m=+242.929684435" lastFinishedPulling="2026-03-08 00:10:34.271507886 +0000 UTC m=+288.391140119" observedRunningTime="2026-03-08 00:10:34.737472543 +0000 UTC m=+288.857104786" watchObservedRunningTime="2026-03-08 00:10:34.739289061 +0000 UTC m=+288.858921294" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.771612 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" podStartSLOduration=13.771593124 podStartE2EDuration="13.771593124s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:34.770721431 +0000 UTC m=+288.890353674" watchObservedRunningTime="2026-03-08 00:10:34.771593124 +0000 UTC m=+288.891225357" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.352331 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: E0308 00:10:35.352901 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc 
kubenswrapper[4713]: I0308 00:10:35.352915 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc kubenswrapper[4713]: E0308 00:10:35.352928 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.352938 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353054 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353068 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353477 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.355205 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.355532 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.356620 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.361173 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.361366 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.363539 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.364535 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.366766 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435736 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " 
pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435993 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.436011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.536933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rlp\" (UniqueName: 
\"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.536992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537058 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.538518 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.538527 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.539040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.547115 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.552758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 
00:10:35.678510 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.911384 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.923430 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:35 crc kubenswrapper[4713]: W0308 00:10:35.925528 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58583d53_0add_4758_8d8b_c309a79b4c48.slice/crio-bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98 WatchSource:0}: Error finding container bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98: Status 404 returned error can't find the container with id bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98 Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.942392 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"6470285d-4460-4c72-be17-00e880cc623d\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.950606 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh" (OuterVolumeSpecName: "kube-api-access-dv9nh") pod "6470285d-4460-4c72-be17-00e880cc623d" (UID: "6470285d-4460-4c72-be17-00e880cc623d"). InnerVolumeSpecName "kube-api-access-dv9nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.043978 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerDied","Data":"4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2"} Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731401 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731415 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.736995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.739122 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerStarted","Data":"bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98"} Mar 08 00:10:37 crc kubenswrapper[4713]: I0308 00:10:37.745435 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" 
event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerStarted","Data":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} Mar 08 00:10:37 crc kubenswrapper[4713]: I0308 00:10:37.761780 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" podStartSLOduration=16.761751213 podStartE2EDuration="16.761751213s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:37.75902055 +0000 UTC m=+291.878652783" watchObservedRunningTime="2026-03-08 00:10:37.761751213 +0000 UTC m=+291.881383486" Mar 08 00:10:38 crc kubenswrapper[4713]: I0308 00:10:38.750992 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:38 crc kubenswrapper[4713]: I0308 00:10:38.756237 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.342817 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343132 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343369 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343425 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:42 crc kubenswrapper[4713]: I0308 00:10:42.817436 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:42 crc kubenswrapper[4713]: I0308 00:10:42.817987 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:43 crc kubenswrapper[4713]: I0308 00:10:43.433632 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:43 crc kubenswrapper[4713]: I0308 00:10:43.824209 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.066281 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.066991 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmk7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rdgpc_openshift-marketplace(dcde95f7-8814-4319-8a48-6d186de5f51f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.068145 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" Mar 08 00:10:47 crc 
kubenswrapper[4713]: E0308 00:10:47.808110 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.236936 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.237645 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kfdss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-57pjt_openshift-marketplace(e23a30a2-2bf8-451e-b85b-b293e8949e9e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.239890 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" Mar 08 00:10:48 crc 
kubenswrapper[4713]: E0308 00:10:48.813564 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.358749 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.823281 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.826637 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.832753 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.835404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.837924 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.848207 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.848426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.851256 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.851325 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.854617 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.854701 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.858915 
4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.859010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.866114 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.866161 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066"} Mar 08 00:10:55 crc kubenswrapper[4713]: I0308 00:10:55.073986 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:11:01 crc kubenswrapper[4713]: I0308 00:11:01.926432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6"} Mar 08 00:11:02 crc kubenswrapper[4713]: I0308 00:11:02.958713 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6gcb" podStartSLOduration=5.154366956 podStartE2EDuration="1m22.958695066s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:42.18781589 +0000 UTC 
m=+236.307448123" lastFinishedPulling="2026-03-08 00:10:59.992144 +0000 UTC m=+314.111776233" observedRunningTime="2026-03-08 00:11:02.955770679 +0000 UTC m=+317.075402962" watchObservedRunningTime="2026-03-08 00:11:02.958695066 +0000 UTC m=+317.078327299" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.491861 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.492655 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.492680 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.492884 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493336 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493470 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493652 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493723 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493814 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493797 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493783 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e" gracePeriod=15 Mar 08 00:11:09 crc 
kubenswrapper[4713]: I0308 00:11:09.494489 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494665 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494684 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494695 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494704 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494714 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494723 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494734 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494742 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494752 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494761 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494778 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494790 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494805 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494816 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494864 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494874 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495009 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495019 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495066 4713 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495079 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495093 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495104 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495117 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495131 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.495294 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495304 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.495316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495325 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495467 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.536575 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551418 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551467 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551517 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551606 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551623 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.652937 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653155 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653266 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653380 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653616 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653672 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 
crc kubenswrapper[4713]: I0308 00:11:09.653666 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653698 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653683 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653808 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.831993 4713 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.975540 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.976899 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977653 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977721 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977733 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977740 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977750 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7" exitCode=2 Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.083719 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.084690 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085258 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085671 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085971 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.086001 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 
00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.845367 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.845603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.886756 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.887311 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.887803 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.986205 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerID="b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6" exitCode=0 Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.986323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerDied","Data":"b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6"} Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 
00:11:10.987072 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.987558 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.987914 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.026428 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.026973 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.027364 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.027710 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: E0308 00:11:11.106243 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-hs88q.189ab53ba5568682 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-hs88q,UID:2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0,APIVersion:v1,ResourceVersion:28774,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.254s (19.254s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,LastTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.996721 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.998052 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd" exitCode=0 Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.500277 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.500894 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.501257 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.501605 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: W0308 00:11:12.550821 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088 WatchSource:0}: Error finding container dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088: Status 404 returned error can't find the container with id dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088 Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.578978 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580132 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580424 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580572 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580702 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580940 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: 
"dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603449 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603901 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock" (OuterVolumeSpecName: "var-lock") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: "dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.605960 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.605991 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.609503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: "dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.706985 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707186 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707332 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707363 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707615 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708021 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708303 4713 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708752 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708773 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708782 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.012549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014000 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014246 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014557 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.020852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.022596 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.023363 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.023854 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024148 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024345 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024486 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024698 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024852 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027800 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerDied","Data":"d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027851 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027948 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033008 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033625 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.034186 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.037924 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.038297 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.038677 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.039121 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.040329 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042025 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042746 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.043041 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.043499 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.045081 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.046397 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.046588 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.047334 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.047773 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048138 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048373 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048744 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.052150 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.052228 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.053434 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.055440 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.056928 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057243 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057376 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057298 4713 scope.go:117] "RemoveContainer" containerID="9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057724 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058002 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058223 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058474 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058695 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058952 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.059177 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.059390 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.075443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016"}
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.075527 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.078933 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079215 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079444 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079668 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079927 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080287 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080503 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080722 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.081159 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.084987 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.085189 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.085555 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088152 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088281 4713 scope.go:117] "RemoveContainer" containerID="d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088683 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088899 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089058 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089214 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089415 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.109224 4713 scope.go:117] "RemoveContainer" containerID="ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.132480 4713 scope.go:117] "RemoveContainer" containerID="3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.146799 4713 scope.go:117] "RemoveContainer" containerID="830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.164812 4713 scope.go:117] "RemoveContainer" containerID="982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.260532 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hs88q"
Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.260596 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hs88q"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.083270 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerID="71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a" exitCode=0
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.083320 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"}
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.084416 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091181 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb" exitCode=0
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091215 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb"}
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091179 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091474 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091882 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093099 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093512 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093868 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094096 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094379 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094737 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095063 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095297 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095558 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095755 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.096008 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused"
Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.096268 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"
pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.097565 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.098058 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.099379 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.099698 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.303579 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hs88q" 
podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" probeResult="failure" output=< Mar 08 00:11:14 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:11:14 crc kubenswrapper[4713]: > Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.547058 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.097804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269"} Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.098780 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099238 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099561 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection 
refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099811 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100078 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100383 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100622 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"} Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100911 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101181 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101550 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101930 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102347 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102579 4713 
status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102868 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103183 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103447 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103668 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103915 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.543222 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.544350 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.544758 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545101 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545462 4713 
status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545797 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546007 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546169 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546415 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:17 crc kubenswrapper[4713]: E0308 00:11:17.352308 4713 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-hs88q.189ab53ba5568682 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-hs88q,UID:2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0,APIVersion:v1,ResourceVersion:28774,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.254s (19.254s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,LastTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.245338 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.245855 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246361 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246646 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246949 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: I0308 00:11:18.246987 4713 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.247209 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.448067 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.849543 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 08 00:11:19 crc kubenswrapper[4713]: E0308 00:11:19.650237 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.375662 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"
],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":9078
37715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276
c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"nam
es\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376216 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376497 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376846 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.377104 4713 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.377129 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.042339 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.042405 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.080909 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.081463 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.081979 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082219 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082458 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082788 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083054 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083353 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083622 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083879 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.176502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.176805 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177058 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177293 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 
crc kubenswrapper[4713]: I0308 00:11:21.177516 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177811 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178063 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178266 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178499 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 
00:11:21.178725 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.239514 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.239621 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: E0308 00:11:21.251850 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.278519 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.278966 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279287 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279488 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279692 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279899 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280059 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280279 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280561 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280817 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.485052 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.485145 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.528397 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529075 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529446 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" 
pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529979 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.530667 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.530990 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.531308 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.531890 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.532169 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.532500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172205 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172665 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172856 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173078 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173288 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173617 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.174069 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.174523 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.175101 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.175362 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.192573 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.193403 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.193895 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194255 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" 
pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194735 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194967 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195182 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195398 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195618 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.142783 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.143959 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.144003 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414" exitCode=1 Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.144113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414"} Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.145850 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146315 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146756 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146982 4713 scope.go:117] "RemoveContainer" containerID="b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147006 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147295 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.147660 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147944 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148166 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148369 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148543 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.313052 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.313644 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.313997 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314219 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314413 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314676 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315025 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315426 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315707 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.316024 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.316233 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.349337 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.349893 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350307 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350492 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350642 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.350776 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350938 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351072 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351201 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351330 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.351460 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.540642 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.541538 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.541886 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.542499 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.542730 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.545600 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546031 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546248 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546410 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546565 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" 
pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546719 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.553457 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.553481 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:23 crc kubenswrapper[4713]: E0308 00:11:23.553903 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.554605 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: W0308 00:11:23.572521 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac WatchSource:0}: Error finding container b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac: Status 404 returned error can't find the container with id b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.692463 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152067 4713 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ed090832d4b722ebd3fbecf4ff160ef991490ffe56d3217e0d5ae483ae265d9a" exitCode=0 Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152157 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ed090832d4b722ebd3fbecf4ff160ef991490ffe56d3217e0d5ae483ae265d9a"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152737 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152761 4713 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:24 crc kubenswrapper[4713]: E0308 00:11:24.153147 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.153165 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.153686 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154118 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154465 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154764 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.155109 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.155563 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156037 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156307 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156439 4713 
status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156762 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.157470 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.157564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc756539acbbd6016530861f0ca3f1b19c51ce9445da649b72e4dbdfb56cf2b7"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.158599 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159058 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159519 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159906 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.160454 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.160933 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.161347 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.161847 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.162300 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.162880 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.237719 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.237776 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.281703 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.282125 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.283094 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.283959 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.284410 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.284907 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 
00:11:24.285313 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.285610 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.286058 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.286603 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.287142 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: E0308 00:11:24.453132 
4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.692713 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.692782 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.729255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.729856 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.730567 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731027 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: 
connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731359 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731661 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731964 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732235 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732534 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732888 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.733160 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.203088 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.203854 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.204330 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.204778 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205078 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205309 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205596 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205800 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206003 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206182 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206389 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209752 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209968 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210227 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210569 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210804 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211047 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211259 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211476 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211685 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211972 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:26 crc kubenswrapper[4713]: I0308 00:11:26.169649 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1cd7e25d02054b293f534ff2e47e1f55bee990db4d8ab079e3a609f0ad8ebcdf"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.178794 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42356eae2569e4cdceef545ada9e3f57b0018356b39cd47ad055a4dfb933acc9"} Mar 08 
00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.179938 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180059 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81d9f67cad6662a85e214fa9b3812349ae8adddb5bebd4bd202c6f33e7b6be24"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180220 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4ebf0f92d1e7564cc3acf1efa9ad3009b0cd48b1b1a27c985a0e02a8a3b19b4"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ded44558bde8bbf893974ab43495d67acd5c3f360394bd11d2a4a5a3eccce799"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.179297 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180530 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.555298 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.555344 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.560452 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.626755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.627095 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.629164 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.629959 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.638864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.643503 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.650497 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:30 crc kubenswrapper[4713]: W0308 00:11:30.098411 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead WatchSource:0}: Error finding container d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead: Status 404 returned error can't find the container with id d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead Mar 08 00:11:30 crc kubenswrapper[4713]: I0308 00:11:30.198304 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead"} Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.005642 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.012604 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.206597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"795b7cfb0b5268f531cd919ce190748af9e0691f8457e2d8607f31d4374958cd"} Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.207007 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:32 crc kubenswrapper[4713]: I0308 00:11:32.187593 4713 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.215415 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.215447 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.218940 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.223964 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7d8bb7b8-9c57-40e6-90fc-52441b10732b" Mar 08 00:11:34 crc kubenswrapper[4713]: I0308 00:11:34.219876 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:34 crc kubenswrapper[4713]: I0308 00:11:34.219902 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:36 crc kubenswrapper[4713]: I0308 00:11:36.562808 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7d8bb7b8-9c57-40e6-90fc-52441b10732b" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.370226 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.422324 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.717944 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.029618 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.141524 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.476303 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.529947 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.673415 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.209770 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.400969 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.495281 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.519089 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.673512 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.689259 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.723290 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.800700 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.895122 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.960947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.049195 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.066672 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.376440 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.638407 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.639630 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.870523 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.877766 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.908160 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.947923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.977121 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.080435 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.149017 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.173412 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.211414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.217239 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.314182 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.372120 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.419976 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.427053 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.469714 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.539144 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.546621 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.581980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.608153 4713 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.636636 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.703188 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.742210 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.869629 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.894568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.952577 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.958360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.986984 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.065650 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.068365 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.068856 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.157606 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.177272 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.177985 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.343740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.434875 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.452229 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.488657 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.495042 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.524950 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.535331 4713 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.584010 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.596141 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.647914 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.671158 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.671654 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.672048 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pd9br" podStartSLOduration=36.546779217 podStartE2EDuration="2m5.672032273s" podCreationTimestamp="2026-03-08 00:09:41 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.284892235 +0000 UTC m=+237.404524468" lastFinishedPulling="2026-03-08 00:11:12.410145281 +0000 UTC m=+326.529777524" observedRunningTime="2026-03-08 00:11:31.955153349 +0000 UTC m=+346.074785582" watchObservedRunningTime="2026-03-08 00:11:46.672032273 +0000 UTC m=+360.791664516" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674037 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.67402798 podStartE2EDuration="37.67402798s" podCreationTimestamp="2026-03-08 00:11:09 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:11:31.823721772 +0000 UTC m=+345.943354005" watchObservedRunningTime="2026-03-08 00:11:46.67402798 +0000 UTC m=+360.793660233" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674350 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-57pjt" podStartSLOduration=80.847170258 podStartE2EDuration="2m3.67434513s" podCreationTimestamp="2026-03-08 00:09:43 +0000 UTC" firstStartedPulling="2026-03-08 00:10:31.656635352 +0000 UTC m=+285.776267585" lastFinishedPulling="2026-03-08 00:11:14.483810224 +0000 UTC m=+328.603442457" observedRunningTime="2026-03-08 00:11:31.970373469 +0000 UTC m=+346.090005722" watchObservedRunningTime="2026-03-08 00:11:46.67434513 +0000 UTC m=+360.793977363" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674546 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rdgpc" podStartSLOduration=79.538307035 podStartE2EDuration="2m2.674541025s" podCreationTimestamp="2026-03-08 00:09:44 +0000 UTC" firstStartedPulling="2026-03-08 00:10:31.659743604 +0000 UTC m=+285.779375837" lastFinishedPulling="2026-03-08 00:11:14.795977594 +0000 UTC m=+328.915609827" observedRunningTime="2026-03-08 00:11:31.888505054 +0000 UTC m=+346.008137287" watchObservedRunningTime="2026-03-08 00:11:46.674541025 +0000 UTC m=+360.794173268" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674808 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x7pkf" podStartSLOduration=38.755281452 podStartE2EDuration="2m6.674803793s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.24727236 +0000 UTC m=+237.366904593" lastFinishedPulling="2026-03-08 00:11:11.166794701 +0000 UTC 
m=+325.286426934" observedRunningTime="2026-03-08 00:11:31.912757865 +0000 UTC m=+346.032390098" watchObservedRunningTime="2026-03-08 00:11:46.674803793 +0000 UTC m=+360.794436026" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674921 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4tj99" podStartSLOduration=37.409628292 podStartE2EDuration="2m6.674917796s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.22698253 +0000 UTC m=+237.346614763" lastFinishedPulling="2026-03-08 00:11:12.492272034 +0000 UTC m=+326.611904267" observedRunningTime="2026-03-08 00:11:31.941523755 +0000 UTC m=+346.061155988" watchObservedRunningTime="2026-03-08 00:11:46.674917796 +0000 UTC m=+360.794550039" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.677907 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hs88q" podStartSLOduration=42.383095358 podStartE2EDuration="2m4.677896312s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="2026-03-08 00:09:48.810456852 +0000 UTC m=+242.930089085" lastFinishedPulling="2026-03-08 00:11:11.105257786 +0000 UTC m=+325.224890039" observedRunningTime="2026-03-08 00:11:31.875313853 +0000 UTC m=+345.994946086" watchObservedRunningTime="2026-03-08 00:11:46.677896312 +0000 UTC m=+360.797528565" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.678718 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.678762 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.682502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:46 crc 
kubenswrapper[4713]: I0308 00:11:46.718433 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.718411573000001 podStartE2EDuration="14.718411573s" podCreationTimestamp="2026-03-08 00:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:11:46.697419586 +0000 UTC m=+360.817051839" watchObservedRunningTime="2026-03-08 00:11:46.718411573 +0000 UTC m=+360.838043806" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.746580 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.784277 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.864542 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.864765 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.900740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.917490 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.989852 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.050100 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.155288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.378338 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.380560 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.569328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.669156 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.674365 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.681233 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.691416 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.896622 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.200495 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 00:11:48 crc 
kubenswrapper[4713]: I0308 00:11:48.233204 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.392636 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.400752 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.413357 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.457935 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.482914 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.518062 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.522101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.594488 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.599893 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.654177 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.748437 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.846664 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.982670 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.993397 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.995678 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.083761 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.123068 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.158962 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.163860 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.266173 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 
00:11:49.388591 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.452991 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.613447 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.660718 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.679474 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.807582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.927436 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.932058 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.932931 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.956766 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.038493 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 00:11:50 crc 
kubenswrapper[4713]: I0308 00:11:50.177768 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.197275 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.252511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.296184 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.416507 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.575916 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.621719 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.635716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.695714 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.703861 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.716065 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 00:11:50 
crc kubenswrapper[4713]: I0308 00:11:50.738907 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.755260 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.804655 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.899487 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.969779 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.983053 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.993309 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.044879 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.049769 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.054177 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.125882 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.138552 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.212732 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.327600 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.371212 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.516162 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.534379 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.572240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.581428 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.622228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.687276 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.716381 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.733237 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.053420 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.194595 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.225252 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.258593 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.283030 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.346121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.398451 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.441808 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.478929 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.519724 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.546314 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.547805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.554380 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.597073 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.602551 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.620415 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.672788 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.728463 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.783882 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:11:52 crc 
kubenswrapper[4713]: I0308 00:11:52.784547 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.803678 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.857275 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.886520 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.980057 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.149311 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.197211 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.297360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.317219 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.332108 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.469150 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.480469 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.516436 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.607704 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.613800 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.641570 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.646857 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.867302 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.878758 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.898112 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.939574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 
00:11:53.970172 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.003492 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.007370 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.025212 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.061388 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.154807 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.333805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.343242 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.390375 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.432725 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.432950 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" gracePeriod=5 Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.553977 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.620264 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.709988 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.783922 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.796418 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.810645 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.970403 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.984150 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.994565 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.032410 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.062567 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.083648 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.237929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.241916 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.333727 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.356771 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.361748 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.364274 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.418228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.472175 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 
00:11:55.653368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.694731 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.849447 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.852728 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.876100 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.917391 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.994725 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.010919 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.048716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.058656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.153110 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.170849 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.218978 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.227274 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.334970 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.433659 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.496040 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.515457 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.778944 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.047685 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.064440 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.375546 
4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.455119 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.476723 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.644025 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.025130 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.222984 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.695993 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.038525 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.038965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168409 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.168867 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168901 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.168938 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.169194 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.169233 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.170208 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.173035 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.174386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.174406 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.177570 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205523 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205618 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205658 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206177 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206243 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206281 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.215431 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.306944 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307281 4713 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307371 4713 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307458 4713 reconciler_common.go:293] "Volume detached for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307529 4713 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307602 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.359852 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.359900 4713 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" exitCode=137 Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.360034 4713 scope.go:117] "RemoveContainer" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.360046 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.384589 4713 scope.go:117] "RemoveContainer" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.385004 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": container with ID starting with ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293 not found: ID does not exist" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.385051 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293"} err="failed to get container status \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": rpc error: code = NotFound desc = could not find container \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": container with ID starting with ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293 not found: ID does not exist" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.408809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.429883 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ncv9\" (UniqueName: 
\"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.499338 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.547302 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.547579 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.558085 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.558128 4713 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b60054b-377a-42aa-a77b-1946ed626065" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.569739 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.569789 4713 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b60054b-377a-42aa-a77b-1946ed626065" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.891630 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.366170 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerStarted","Data":"4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546"} Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.402088 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.402336 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" containerID="cri-o://238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" gracePeriod=30 Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.496488 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.497079 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" containerID="cri-o://60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" gracePeriod=30 Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.748193 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.854557 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927585 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927662 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927684 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.928730 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config" (OuterVolumeSpecName: "config") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.928726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929077 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929108 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929264 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca" (OuterVolumeSpecName: "client-ca") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.932710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.932812 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp" (OuterVolumeSpecName: "kube-api-access-48rlp") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "kube-api-access-48rlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029957 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca" (OuterVolumeSpecName: "client-ca") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030722 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030764 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030778 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030880 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config" (OuterVolumeSpecName: "config") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.032732 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x" (OuterVolumeSpecName: "kube-api-access-zjt7x") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "kube-api-access-zjt7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.032974 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132188 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132218 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132227 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132237 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.372440 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerStarted","Data":"71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.373952 4713 generic.go:334] "Generic (PLEG): container finished" podID="58583d53-0add-4758-8d8b-c309a79b4c48" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" exitCode=0 Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374016 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerDied","Data":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374036 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerDied","Data":"bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374043 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374056 4713 scope.go:117] "RemoveContainer" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376052 4713 generic.go:334] "Generic (PLEG): container finished" podID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" exitCode=0 Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376085 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerDied","Data":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376107 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerDied","Data":"ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376139 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.389263 4713 scope.go:117] "RemoveContainer" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: E0308 00:12:02.392392 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": container with ID starting with 238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1 not found: ID does not exist" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.392480 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} err="failed to get container status \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": rpc error: code = NotFound desc = could not find container \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": container with ID starting with 238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1 not found: ID does not exist" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.392520 4713 scope.go:117] "RemoveContainer" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.400473 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548812-24fjw" podStartSLOduration=1.30879019 podStartE2EDuration="2.400454991s" podCreationTimestamp="2026-03-08 00:12:00 +0000 UTC" firstStartedPulling="2026-03-08 00:12:00.90143606 +0000 UTC m=+375.021068323" lastFinishedPulling="2026-03-08 00:12:01.993100891 +0000 UTC m=+376.112733124" 
observedRunningTime="2026-03-08 00:12:02.391896804 +0000 UTC m=+376.511529057" watchObservedRunningTime="2026-03-08 00:12:02.400454991 +0000 UTC m=+376.520087234" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.408194 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.411193 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414191 4713 scope.go:117] "RemoveContainer" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414525 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:02 crc kubenswrapper[4713]: E0308 00:12:02.414586 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": container with ID starting with 60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3 not found: ID does not exist" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414612 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} err="failed to get container status \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": rpc error: code = NotFound desc = could not find container \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": container with ID starting with 60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3 not found: ID does not exist" Mar 08 
00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.417528 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.547661 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" path="/var/lib/kubelet/pods/58583d53-0add-4758-8d8b-c309a79b4c48/volumes" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.549653 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" path="/var/lib/kubelet/pods/7daca87e-5103-46bd-b6ae-7643c66a4fbc/volumes" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.387751 4713 generic.go:334] "Generic (PLEG): container finished" podID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerID="71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be" exitCode=0 Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.387818 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerDied","Data":"71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be"} Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414294 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:03 crc kubenswrapper[4713]: E0308 00:12:03.414698 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414725 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: E0308 00:12:03.414761 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414773 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414950 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414969 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.415415 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.420421 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.421897 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.422018 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.422413 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.424669 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.426278 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.435083 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.444023 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.444960 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448257 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448556 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448605 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448685 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448576 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449411 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: 
\"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449553 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449594 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449631 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449702 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449853 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.453276 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.465959 4713 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.472032 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551797 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551901 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551954 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " 
pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552102 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552222 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552270 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslqk\" (UniqueName: 
\"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553112 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553921 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.554085 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.556392 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.556547 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.557360 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.567201 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.575734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " 
pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.754471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.766676 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.182559 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:04 crc kubenswrapper[4713]: W0308 00:12:04.188185 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a0c57d_18d7_440f_aa59_4a55988fcd25.slice/crio-d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b WatchSource:0}: Error finding container d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b: Status 404 returned error can't find the container with id d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.235292 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:04 crc kubenswrapper[4713]: W0308 00:12:04.240289 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b526e8_eda3_4eaf_b5ed_15ed74c51d76.slice/crio-00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0 WatchSource:0}: Error finding container 00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0: Status 404 returned error can't find the container with id 00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0 Mar 08 00:12:04 crc 
kubenswrapper[4713]: I0308 00:12:04.395477 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerStarted","Data":"6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.395795 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerStarted","Data":"00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.395815 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397240 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerStarted","Data":"2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397285 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerStarted","Data":"d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397631 4713 patch_prober.go:28] interesting pod/controller-manager-795f4d9bc7-g9wgf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397674 4713 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.414781 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podStartSLOduration=3.414752659 podStartE2EDuration="3.414752659s" podCreationTimestamp="2026-03-08 00:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:04.413684468 +0000 UTC m=+378.533316711" watchObservedRunningTime="2026-03-08 00:12:04.414752659 +0000 UTC m=+378.534384892" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.623786 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.766199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"12cdabef-a56e-45d2-8896-aab98bd84fb1\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.772176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9" (OuterVolumeSpecName: "kube-api-access-6ncv9") pod "12cdabef-a56e-45d2-8896-aab98bd84fb1" (UID: "12cdabef-a56e-45d2-8896-aab98bd84fb1"). InnerVolumeSpecName "kube-api-access-6ncv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.867660 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerDied","Data":"4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546"} Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403607 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403479 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.404085 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.407243 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.411910 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.427567 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" podStartSLOduration=4.427547132 
podStartE2EDuration="4.427547132s" podCreationTimestamp="2026-03-08 00:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:05.421696043 +0000 UTC m=+379.541328286" watchObservedRunningTime="2026-03-08 00:12:05.427547132 +0000 UTC m=+379.547179375" Mar 08 00:12:06 crc kubenswrapper[4713]: I0308 00:12:06.343240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:12:23 crc kubenswrapper[4713]: I0308 00:12:23.813682 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.856894 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.857419 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" containerID="cri-o://6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" gracePeriod=30 Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.870988 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.871221 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" containerID="cri-o://2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" gracePeriod=30 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 
00:12:27.742989 4713 generic.go:334] "Generic (PLEG): container finished" podID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerID="6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" exitCode=0 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.743071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerDied","Data":"6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f"} Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.745073 4713 generic.go:334] "Generic (PLEG): container finished" podID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerID="2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" exitCode=0 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.745097 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerDied","Data":"2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6"} Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.939680 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968466 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:27 crc kubenswrapper[4713]: E0308 00:12:27.968700 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968715 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: E0308 00:12:27.968731 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968739 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968880 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968895 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.969246 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.980215 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.005252 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.126317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127126 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127175 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127248 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127293 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127321 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127342 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127375 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127529 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod 
\"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127640 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127668 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128391 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config" (OuterVolumeSpecName: "config") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128464 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca" (OuterVolumeSpecName: "client-ca") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128596 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config" (OuterVolumeSpecName: "config") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128842 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca" (OuterVolumeSpecName: "client-ca") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132133 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j" (OuterVolumeSpecName: "kube-api-access-z726j") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "kube-api-access-z726j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132370 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk" (OuterVolumeSpecName: "kube-api-access-mslqk") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "kube-api-access-mslqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.134481 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.193568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228739 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc 
kubenswrapper[4713]: I0308 00:12:28.228984 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228997 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229008 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229021 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229032 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229043 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229054 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229065 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslqk\" (UniqueName: 
\"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229076 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.230104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.230172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.232127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.244495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: 
\"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.315579 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.723195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751621 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerDied","Data":"00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751668 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751671 4713 scope.go:117] "RemoveContainer" containerID="6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.760702 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerDied","Data":"d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.760817 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.765116 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerStarted","Data":"8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.773690 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.775749 4713 scope.go:117] "RemoveContainer" containerID="2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.777905 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.783517 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.786735 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.789815 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.775316 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerStarted","Data":"a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04"} Mar 08 00:12:29 crc 
kubenswrapper[4713]: I0308 00:12:29.775655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.781918 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.824675 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" podStartSLOduration=3.824651296 podStartE2EDuration="3.824651296s" podCreationTimestamp="2026-03-08 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:29.800314418 +0000 UTC m=+403.919946701" watchObservedRunningTime="2026-03-08 00:12:29.824651296 +0000 UTC m=+403.944283549" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.549036 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" path="/var/lib/kubelet/pods/14a0c57d-18d7-440f-aa59-4a55988fcd25/volumes" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.549534 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" path="/var/lib/kubelet/pods/67b526e8-eda3-4eaf-b5ed-15ed74c51d76/volumes" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.686794 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:30 crc kubenswrapper[4713]: E0308 00:12:30.687075 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687092 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687263 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.689471 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.690684 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.692084 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.692239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.695445 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.695889 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.698039 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.702716 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858871 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858951 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960438 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960475 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.961865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.961967 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.968521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.971356 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 
00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.977202 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.008583 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.408913 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:31 crc kubenswrapper[4713]: W0308 00:12:31.412066 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28926f2e_f630_49fa_87f7_2c82067f06cc.slice/crio-0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240 WatchSource:0}: Error finding container 0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240: Status 404 returned error can't find the container with id 0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240 Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.786908 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerStarted","Data":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.786957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" 
event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerStarted","Data":"0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240"} Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.793560 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.798054 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.815759 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" podStartSLOduration=6.8157424429999995 podStartE2EDuration="6.815742443s" podCreationTimestamp="2026-03-08 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:31.813134469 +0000 UTC m=+405.932766692" watchObservedRunningTime="2026-03-08 00:12:32.815742443 +0000 UTC m=+406.935374676" Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.570549 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.571622 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" containerID="cri-o://a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" gracePeriod=2 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.705609 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.705848 4713 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" containerID="cri-o://54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" gracePeriod=2 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.965480 4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" exitCode=0 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.965688 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496"} Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.969651 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" exitCode=0 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.969678 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.054692 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162485 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162552 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.163681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities" (OuterVolumeSpecName: "utilities") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.166580 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.168620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc" (OuterVolumeSpecName: "kube-api-access-9t4bc") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "kube-api-access-9t4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.232881 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263813 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263857 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263871 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjqb\" (UniqueName: 
\"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365432 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.366673 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities" (OuterVolumeSpecName: "utilities") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.368443 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.369558 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb" (OuterVolumeSpecName: "kube-api-access-7bjqb") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "kube-api-access-7bjqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.420185 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.469729 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.469762 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976510 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976869 4713 scope.go:117] "RemoveContainer" containerID="a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976536 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.979805 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.979951 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.997601 4713 scope.go:117] "RemoveContainer" containerID="c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.007950 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.014281 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.020734 4713 scope.go:117] "RemoveContainer" containerID="10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.027876 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.031538 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.037790 4713 scope.go:117] "RemoveContainer" containerID="54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.056250 4713 scope.go:117] "RemoveContainer" 
containerID="208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.069424 4713 scope.go:117] "RemoveContainer" containerID="f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.306451 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.306680 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" containerID="cri-o://023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" gracePeriod=2 Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.548708 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" path="/var/lib/kubelet/pods/c33b42a1-bf95-490f-a907-765855ec81d1/volumes" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.549447 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" path="/var/lib/kubelet/pods/cd4a956b-6edb-436e-bd5e-5d57899c2ea1/volumes" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.989081 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" exitCode=0 Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.989166 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016"} Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.297150 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.305582 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.305788 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" containerID="cri-o://bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" gracePeriod=2 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.471952 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.472427 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" containerID="cri-o://765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" gracePeriod=30 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492155 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492373 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.493452 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities" (OuterVolumeSpecName: "utilities") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.497926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck" (OuterVolumeSpecName: "kube-api-access-sxjck") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "kube-api-access-sxjck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.516723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.570753 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.570958 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" containerID="cri-o://a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04" gracePeriod=30 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593744 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593780 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593790 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.003871 4713 generic.go:334] "Generic (PLEG): container finished" podID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerID="a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.003926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" 
event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerDied","Data":"a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerDied","Data":"8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004178 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004562 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007466 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007490 4713 scope.go:117] "RemoveContainer" containerID="023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016736 4713 generic.go:334] "Generic (PLEG): container finished" podID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016873 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerDied","Data":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016907 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerDied","Data":"0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.017011 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023331 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023395 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023589 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.037537 4713 scope.go:117] "RemoveContainer" containerID="f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.079009 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.081923 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.093184 4713 scope.go:117] "RemoveContainer" containerID="30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.120501 4713 scope.go:117] "RemoveContainer" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.133274 4713 scope.go:117] "RemoveContainer" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: 
E0308 00:12:42.134166 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": container with ID starting with 765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687 not found: ID does not exist" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.134198 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} err="failed to get container status \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": rpc error: code = NotFound desc = could not find container \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": container with ID starting with 765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687 not found: ID does not exist" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.195423 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200453 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200595 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200623 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200640 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201615 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config" (OuterVolumeSpecName: "config") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201887 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config" (OuterVolumeSpecName: "config") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.202187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.205850 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s" (OuterVolumeSpecName: "kube-api-access-wnk2s") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "kube-api-access-wnk2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206103 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206249 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8" (OuterVolumeSpecName: "kube-api-access-gn4r8") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "kube-api-access-gn4r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.302864 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.302924 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303026 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303334 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303365 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303383 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303396 4713 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303409 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303420 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303431 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303443 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303454 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.304003 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities" (OuterVolumeSpecName: "utilities") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.306085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f" (OuterVolumeSpecName: "kube-api-access-nmk7f") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "kube-api-access-nmk7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.355239 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.358159 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.404395 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.404426 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.430398 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.505769 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.548331 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" path="/var/lib/kubelet/pods/28926f2e-f630-49fa-87f7-2c82067f06cc/volumes" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.549118 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" path="/var/lib/kubelet/pods/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0/volumes" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943148 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943425 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943440 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943453 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943462 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943476 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943484 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943498 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943508 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943519 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943528 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943539 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943548 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943559 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943567 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943580 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943588 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943602 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943610 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943624 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943634 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943647 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943655 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943666 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943675 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943685 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943693 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943703 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943711 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943845 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943859 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943879 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943891 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943901 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943913 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.944361 4713 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.947571 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.949386 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.949703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.950118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.950331 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.952180 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.952273 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.953017 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.958254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.961288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.965026 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035906 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e"} Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035952 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035970 4713 scope.go:117] "RemoveContainer" containerID="bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038479 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerID="f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf" exitCode=0 Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038544 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038532 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerDied","Data":"f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf"} Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.052142 4713 scope.go:117] "RemoveContainer" containerID="811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.058738 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.063786 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.079982 4713 scope.go:117] "RemoveContainer" containerID="eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.089449 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.095778 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 
00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112970 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113103 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113173 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113233 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113334 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc 
kubenswrapper[4713]: I0308 00:12:43.214085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214137 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214187 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214210 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: 
\"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214241 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214272 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214305 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.215359 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.215473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.216433 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.217308 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.218540 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.218611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 
00:12:43.234402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.243197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.276639 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.294416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.569801 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.695495 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.010195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.047874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerStarted","Data":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.048238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.048254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerStarted","Data":"f98a407edc02834035ff48f1d7184aacc2041ec72127750f74d7bd1587b0b9d2"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.049148 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerStarted","Data":"a828f2f69d7e7e4fb80afb9cc983c532df289af847177c4e0ec6d1fbe997c392"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.050148 4713 patch_prober.go:28] interesting pod/route-controller-manager-86cddb879c-x9ppd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.050179 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.067647 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podStartSLOduration=3.067624726 podStartE2EDuration="3.067624726s" podCreationTimestamp="2026-03-08 00:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:44.065748847 +0000 UTC m=+418.185381080" watchObservedRunningTime="2026-03-08 00:12:44.067624726 +0000 UTC m=+418.187256969" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.268270 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.430548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"2ab8d84d-9110-4bed-8288-4764d7c10f74\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.430602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"2ab8d84d-9110-4bed-8288-4764d7c10f74\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.431580 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca" (OuterVolumeSpecName: "serviceca") pod "2ab8d84d-9110-4bed-8288-4764d7c10f74" (UID: "2ab8d84d-9110-4bed-8288-4764d7c10f74"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.438405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw" (OuterVolumeSpecName: "kube-api-access-rtmqw") pod "2ab8d84d-9110-4bed-8288-4764d7c10f74" (UID: "2ab8d84d-9110-4bed-8288-4764d7c10f74"). InnerVolumeSpecName "kube-api-access-rtmqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.531997 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.532028 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.548266 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" path="/var/lib/kubelet/pods/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528/volumes" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.548724 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" path="/var/lib/kubelet/pods/dcde95f7-8814-4319-8a48-6d186de5f51f/volumes" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerDied","Data":"6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8"} Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060374 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060206 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.063064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerStarted","Data":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.063460 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.069193 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.069655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.095571 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" podStartSLOduration=4.095548315 podStartE2EDuration="4.095548315s" podCreationTimestamp="2026-03-08 00:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:45.087202977 +0000 UTC m=+419.206835230" watchObservedRunningTime="2026-03-08 00:12:45.095548315 +0000 UTC m=+419.215180558" Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.396805 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.397561 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" containerID="cri-o://1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" gracePeriod=30 Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.410104 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.410585 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" containerID="cri-o://e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" gracePeriod=30 Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.976469 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.993241 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.136946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.136997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137843 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca" (OuterVolumeSpecName: "client-ca") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca" (OuterVolumeSpecName: "client-ca") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138050 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138100 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138117 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138139 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138175 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138509 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config" (OuterVolumeSpecName: "config") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138549 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138874 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config" (OuterVolumeSpecName: "config") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139077 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139125 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139138 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139146 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139155 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142277 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142328 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf" (OuterVolumeSpecName: "kube-api-access-wb5vf") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "kube-api-access-wb5vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.143800 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj" (OuterVolumeSpecName: "kube-api-access-6pnqj") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "kube-api-access-6pnqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155030 4713 generic.go:334] "Generic (PLEG): container finished" podID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" exitCode=0 Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerDied","Data":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerDied","Data":"f98a407edc02834035ff48f1d7184aacc2041ec72127750f74d7bd1587b0b9d2"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155151 4713 scope.go:117] "RemoveContainer" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155268 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159344 4713 generic.go:334] "Generic (PLEG): container finished" podID="d80407f9-98a7-488a-aba0-f718da170a35" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" exitCode=0 Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159371 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159396 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerDied","Data":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerDied","Data":"a828f2f69d7e7e4fb80afb9cc983c532df289af847177c4e0ec6d1fbe997c392"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.172737 4713 scope.go:117] "RemoveContainer" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.173448 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": container with ID starting with e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4 not found: ID does not exist" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.173477 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} err="failed to get container status \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": rpc error: code = NotFound desc = could not find container \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": container with ID starting with e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4 not found: ID does 
not exist" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.173499 4713 scope.go:117] "RemoveContainer" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.181325 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.184378 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.190781 4713 scope.go:117] "RemoveContainer" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.191233 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": container with ID starting with 1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda not found: ID does not exist" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.191268 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} err="failed to get container status \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": rpc error: code = NotFound desc = could not find container \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": container with ID starting with 1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda not found: ID does not exist" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.192425 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.197049 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240871 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240910 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240923 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240934 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.548184 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" path="/var/lib/kubelet/pods/97ffa397-8c2d-4614-81c8-f0bd196db252/volumes" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.548688 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80407f9-98a7-488a-aba0-f718da170a35" path="/var/lib/kubelet/pods/d80407f9-98a7-488a-aba0-f718da170a35/volumes" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956352 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956900 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956916 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956931 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956939 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956950 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957053 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957064 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957071 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957409 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.960240 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.960733 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962027 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962303 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962408 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964297 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964879 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.965075 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.965290 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.971963 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.973582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.973668 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.976105 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.981118 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150675 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150715 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150780 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150805 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: 
\"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.151005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.151036 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252132 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252158 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252236 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " 
pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252314 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252340 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253474 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253898 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.254320 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.254424 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.256810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.257312 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.270213 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.275750 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.279958 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.288753 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.671655 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.778482 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:03 crc kubenswrapper[4713]: W0308 00:13:03.781019 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf42dbb0_d1f1_44a1_8f0f_f26bcae1ec2f.slice/crio-625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c WatchSource:0}: Error finding container 625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c: Status 404 returned error can't find the container with id 625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172260 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" event={"ID":"6ef732ea-c325-44b3-9624-63ea4f20e3c5","Type":"ContainerStarted","Data":"12dcbf3a5435ed5281f646f5d6ca495ee6e9e4efd37433b82af66cc6b99c1ca7"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172642 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172661 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" event={"ID":"6ef732ea-c325-44b3-9624-63ea4f20e3c5","Type":"ContainerStarted","Data":"7d6af5756e571ffd3e794b0bf99d5433d1152e4315fafa69d81b14c70429744d"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.173703 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerStarted","Data":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.173758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerStarted","Data":"625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.174210 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.178403 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.180751 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.187581 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" podStartSLOduration=3.187561811 podStartE2EDuration="3.187561811s" podCreationTimestamp="2026-03-08 00:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:04.186035011 +0000 UTC m=+438.305667264" watchObservedRunningTime="2026-03-08 00:13:04.187561811 +0000 UTC m=+438.307194044" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.257610 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" podStartSLOduration=3.2575753069999998 podStartE2EDuration="3.257575307s" podCreationTimestamp="2026-03-08 00:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:04.254553428 +0000 UTC m=+438.374185681" watchObservedRunningTime="2026-03-08 00:13:04.257575307 +0000 UTC m=+438.377207540" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.500849 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.500923 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.392531 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.393329 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" containerID="cri-o://b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" gracePeriod=30 Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.876778 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946174 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946616 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946669 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.947176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.947236 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config" (OuterVolumeSpecName: "config") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.952702 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.953429 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh" (OuterVolumeSpecName: "kube-api-access-jxwhh") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "kube-api-access-jxwhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048033 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048062 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048076 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048087 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268461 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" exitCode=0 Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerDied","Data":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" 
event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerDied","Data":"625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c"} Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268586 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268656 4713 scope.go:117] "RemoveContainer" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.284783 4713 scope.go:117] "RemoveContainer" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: E0308 00:13:22.285274 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": container with ID starting with b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c not found: ID does not exist" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.285317 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} err="failed to get container status \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": rpc error: code = NotFound desc = could not find container \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": container with ID starting with b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c not found: ID does not exist" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.308954 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 
00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.312228 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.549177 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" path="/var/lib/kubelet/pods/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f/volumes" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.967517 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:22 crc kubenswrapper[4713]: E0308 00:13:22.968176 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968256 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968406 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968797 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.971215 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.971588 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972013 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972544 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972924 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.973228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.979860 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063165 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063198 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063232 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.163916 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.163989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod 
\"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.164054 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.164086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.165044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.165376 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.167921 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.184633 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.286630 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.674631 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279294 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" event={"ID":"741fdcbc-fc9d-499a-958e-0e605cb9a874","Type":"ContainerStarted","Data":"dece7a0e6f49a84784dc86cc91c70b8267cc1d04fbfe058516aeb00cf435ee85"} Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" event={"ID":"741fdcbc-fc9d-499a-958e-0e605cb9a874","Type":"ContainerStarted","Data":"73ffdca953cd89eabe24a66d50ad9de20666a72933814c8c2e19bd8e6aa00922"} Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279628 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.285745 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.298413 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" podStartSLOduration=3.298392686 podStartE2EDuration="3.298392686s" podCreationTimestamp="2026-03-08 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:24.294490624 +0000 UTC m=+458.414122867" watchObservedRunningTime="2026-03-08 00:13:24.298392686 +0000 UTC m=+458.418024929" Mar 08 00:13:34 crc kubenswrapper[4713]: I0308 00:13:34.501030 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:13:34 crc kubenswrapper[4713]: I0308 00:13:34.501979 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.331792 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.333435 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" containerID="cri-o://99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.347847 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.348355 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" containerID="cri-o://e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.355902 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.356174 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" containerID="cri-o://fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.370025 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.370927 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5hssk" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server" containerID="cri-o://4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.375126 4713 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.375444 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server" containerID="cri-o://4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.379307 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.380268 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.391220 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485025 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485157 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585816 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.587295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.592805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.602860 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.710217 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.846607 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847389 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847653 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847743 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server"
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.933528 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p9hqz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.933591 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.042789 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043276 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043614 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043676 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.114547 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"]
Mar 08 00:13:51 crc kubenswrapper[4713]: W0308 00:13:51.128233 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e0cfc6_458c_4be3_b57c_1cd5fad657c4.slice/crio-4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478 WatchSource:0}: Error finding container 4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478: Status 404 returned error can't find the container with id 4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.367566 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.447240 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" exitCode=0
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.447327 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450029 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" exitCode=0
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450880 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" event={"ID":"26e0cfc6-458c-4be3-b57c-1cd5fad657c4","Type":"ContainerStarted","Data":"4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452621 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" exitCode=0
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452754 4713 scope.go:117] "RemoveContainer" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452954 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.461923 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e570b68-8b4c-42e3-839d-f37943999246" containerID="fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41" exitCode=0
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.461999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerDied","Data":"fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.469781 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerID="4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1" exitCode=0
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.469826 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"}
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.480480 4713 scope.go:117] "RemoveContainer" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.498377 4713 scope.go:117] "RemoveContainer" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499480 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499513 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499641 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.500630 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities" (OuterVolumeSpecName: "utilities") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.516447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97" (OuterVolumeSpecName: "kube-api-access-bsx97") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "kube-api-access-bsx97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.534970 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.577550 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.581283 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582055 4713 scope.go:117] "RemoveContainer" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582303 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": container with ID starting with 4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817 not found: ID does not exist" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582330 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"} err="failed to get container status \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": rpc error: code = NotFound desc = could not find container \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": container with ID starting with 4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817 not found: ID does not exist"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582348 4713 scope.go:117] "RemoveContainer" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582573 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": container with ID starting with 524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133 not found: ID does not exist" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582599 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"} err="failed to get container status \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": rpc error: code = NotFound desc = could not find container \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": container with ID starting with 524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133 not found: ID does not exist"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582614 4713 scope.go:117] "RemoveContainer" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"
Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582853 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": container with ID starting with fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1 not found: ID does not exist" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582875 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"} err="failed to get container status \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": rpc error: code = NotFound desc = could not find container \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": container with ID starting with fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1 not found: ID does not exist"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.586585 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.596643 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt"
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601334 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601360 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601368 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701985 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701999 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702070 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702091 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702110 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702152 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702182 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") "
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.703079 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities" (OuterVolumeSpecName: "utilities") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.703163 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities" (OuterVolumeSpecName: "utilities") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.704693 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities" (OuterVolumeSpecName: "utilities") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705268 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2" (OuterVolumeSpecName: "kube-api-access-m8fx2") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "kube-api-access-m8fx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705335 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2" (OuterVolumeSpecName: "kube-api-access-795x2") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "kube-api-access-795x2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.707643 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.712043 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss" (OuterVolumeSpecName: "kube-api-access-kfdss") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "kube-api-access-kfdss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.712161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn" (OuterVolumeSpecName: "kube-api-access-prrdn") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "kube-api-access-prrdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.772673 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.785058 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"]
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.785803 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.788555 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"]
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804101 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804138 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804151 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804165 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804176 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804187 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804197 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804209 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804219 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804229 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804239 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.834277 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.905946 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.032759 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerDied","Data":"8a2d896d73aedf449a67c5c1becd624d05fd0cc1bac64192c1528302ec9e1810"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477529 4713 scope.go:117] "RemoveContainer" containerID="fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477089 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.488676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"7c30588800e0dac5ab38807a23f6184382c53099e569400f6073fb7739048d46"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.488886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.493544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"3cdea3678803ad7453d0a386b7a4a0468a866e4a3767422ad83b05a97ef4bf14"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.493672 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.495859 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.496062 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.499377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" event={"ID":"26e0cfc6-458c-4be3-b57c-1cd5fad657c4","Type":"ContainerStarted","Data":"3b2176370935e6a2e1310e78999dfe2021e4e97c1e8a1c47e184b64c068dff71"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.499991 4713 scope.go:117] "RemoveContainer" containerID="4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.509596 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.518553 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.525920 4713 scope.go:117] "RemoveContainer" containerID="71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.532463 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.536539 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.548896 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" path="/var/lib/kubelet/pods/822fdb72-7e7f-441b-8ebc-178ef46cca73/volumes"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.549787 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e570b68-8b4c-42e3-839d-f37943999246" path="/var/lib/kubelet/pods/9e570b68-8b4c-42e3-839d-f37943999246/volumes"
Mar 08 00:13:52 crc
kubenswrapper[4713]: I0308 00:13:52.550393 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" path="/var/lib/kubelet/pods/e23a30a2-2bf8-451e-b85b-b293e8949e9e/volumes" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.555676 4713 scope.go:117] "RemoveContainer" containerID="99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558435 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558737 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558748 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558756 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558767 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558775 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558795 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-content" Mar 08 
00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558803 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558814 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558840 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558853 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558860 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558870 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558879 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558890 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558897 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558906 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" Mar 08 
00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558913 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558923 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558931 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-utilities" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558940 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558947 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558958 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558968 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558981 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558989 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-content" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559099 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" Mar 08 00:13:52 crc 
kubenswrapper[4713]: I0308 00:13:52.559111 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559124 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559132 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559140 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559790 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.561279 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.562900 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" podStartSLOduration=2.562874129 podStartE2EDuration="2.562874129s" podCreationTimestamp="2026-03-08 00:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:52.555388493 +0000 UTC m=+486.675020726" watchObservedRunningTime="2026-03-08 00:13:52.562874129 +0000 UTC m=+486.682506372" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.582681 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 
00:13:52.591299 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.594249 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.597955 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.603562 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.608505 4713 scope.go:117] "RemoveContainer" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615803 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: 
\"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.624693 4713 scope.go:117] "RemoveContainer" containerID="46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.640440 4713 scope.go:117] "RemoveContainer" containerID="b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.654483 4713 scope.go:117] "RemoveContainer" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.671137 4713 scope.go:117] "RemoveContainer" containerID="c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.685977 4713 scope.go:117] "RemoveContainer" containerID="e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717624 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717660 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.718117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.718179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.736184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.903189 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.285286 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.507318 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"872b442fcf53dc350c20c113c6415793cd135f6045c9203dc5387eb2fa9f45e6"} Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.511108 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.513483 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.517422 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6" exitCode=0 Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.517520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"} Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.555149 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40864d72-e137-478e-8340-8c0f107b4c60" path="/var/lib/kubelet/pods/40864d72-e137-478e-8340-8c0f107b4c60/volumes" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.555760 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" 
path="/var/lib/kubelet/pods/d9341928-7a63-4190-ac37-ac9ba3320e18/volumes" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.747887 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rc7p9"] Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.749072 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.751223 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.752385 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc7p9"] Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847511 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847659 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " 
pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.946651 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"] Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949168 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949674 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.950199 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.952638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.958561 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"] Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.978118 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050153 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " 
pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050281 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.067576 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152013 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.153040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.153162 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.168689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.266061 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:56.527957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"} Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.149434 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"] Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.151271 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.156085 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.169000 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"] Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275848 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275885 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: 
\"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.377365 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.377516 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.398928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " 
pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.471764 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.535268 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87" exitCode=0 Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.535311 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"} Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.660649 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"] Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.671903 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc7p9"] Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.881965 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"] Mar 08 00:13:57 crc kubenswrapper[4713]: W0308 00:13:57.945192 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47027c84_0848_4140_bed0_b04f627cf6da.slice/crio-9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985 WatchSource:0}: Error finding container 9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985: Status 404 returned error can't find the container with id 9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985 Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.542800 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="47027c84-0848-4140-bed0-b04f627cf6da" containerID="9a0df9293f72faa7276bbe231d291ea87223054854a62c6b1b5c4bd0259e51c3" exitCode=0 Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.549194 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd52d225-2e7e-4958-98fc-52028b545353" containerID="aadbf7018d16076dbc657aee64f072e5fa75d59cee7dfc64efd4d955bb09047f" exitCode=0 Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551736 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerDied","Data":"9a0df9293f72faa7276bbe231d291ea87223054854a62c6b1b5c4bd0259e51c3"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551784 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerStarted","Data":"9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerDied","Data":"aadbf7018d16076dbc657aee64f072e5fa75d59cee7dfc64efd4d955bb09047f"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerStarted","Data":"da176d9ce501a71e28b9b129ce4463db6fd643bac822e26967dcf30bf45fd6d1"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.553355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"} 
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559778 4713 generic.go:334] "Generic (PLEG): container finished" podID="ce49dca5-e07d-416e-a72d-281928ff343b" containerID="c9c8290700ae32e35f4e8c1fbafbcb84417ece9a1cd89281d73bf49c5bff9d55" exitCode=0 Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559832 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerDied","Data":"c9c8290700ae32e35f4e8c1fbafbcb84417ece9a1cd89281d73bf49c5bff9d55"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerStarted","Data":"c1baf1e2075d9562517317f11a3a8fc622ea3dc337446ca35af5596187bdc0e8"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.626253 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4m4tz" podStartSLOduration=3.084133765 podStartE2EDuration="6.626229212s" podCreationTimestamp="2026-03-08 00:13:52 +0000 UTC" firstStartedPulling="2026-03-08 00:13:54.518688993 +0000 UTC m=+488.638321226" lastFinishedPulling="2026-03-08 00:13:58.06078443 +0000 UTC m=+492.180416673" observedRunningTime="2026-03-08 00:13:58.597059438 +0000 UTC m=+492.716691671" watchObservedRunningTime="2026-03-08 00:13:58.626229212 +0000 UTC m=+492.745861445" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.134959 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.136065 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.137709 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.137961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.138198 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.146933 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.210063 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.311166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.342682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " 
pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.452867 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.583250 4713 generic.go:334] "Generic (PLEG): container finished" podID="ce49dca5-e07d-416e-a72d-281928ff343b" containerID="205e9cc478dd42700f6421ca490ab2b0f6325662ad101bb7df497af6f7e2ab66" exitCode=0 Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.583338 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerDied","Data":"205e9cc478dd42700f6421ca490ab2b0f6325662ad101bb7df497af6f7e2ab66"} Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.916213 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.590450 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd52d225-2e7e-4958-98fc-52028b545353" containerID="556cce46681884709e1c392b6e28d24a72979eb3cd29aad02ebd53c0a0257993" exitCode=0 Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.590519 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerDied","Data":"556cce46681884709e1c392b6e28d24a72979eb3cd29aad02ebd53c0a0257993"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.592044 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerStarted","Data":"1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.594333 4713 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerStarted","Data":"487b88ed04c9e30985de38c1060c602641dcbeb5ac418265f637727b6a07135b"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.596793 4713 generic.go:334] "Generic (PLEG): container finished" podID="47027c84-0848-4140-bed0-b04f627cf6da" containerID="8df488645c07a85b328b7cb34047ade461a7d16e4ebff0de97f353a98741b972" exitCode=0 Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.596874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerDied","Data":"8df488645c07a85b328b7cb34047ade461a7d16e4ebff0de97f353a98741b972"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.648749 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mn4rt" podStartSLOduration=4.8874088239999995 podStartE2EDuration="7.648729358s" podCreationTimestamp="2026-03-08 00:13:54 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.566064126 +0000 UTC m=+492.685696359" lastFinishedPulling="2026-03-08 00:14:01.32738466 +0000 UTC m=+495.447016893" observedRunningTime="2026-03-08 00:14:01.646046717 +0000 UTC m=+495.765678970" watchObservedRunningTime="2026-03-08 00:14:01.648729358 +0000 UTC m=+495.768361591" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.904456 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.904775 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.950672 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.608778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerStarted","Data":"96abd5ee2356fcc5329aa327b08fceb46b417e579b158308fd457699b9419ea4"} Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.613610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerStarted","Data":"dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53"} Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.647949 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548814-v94cz" podStartSLOduration=1.618770863 podStartE2EDuration="3.647928127s" podCreationTimestamp="2026-03-08 00:14:00 +0000 UTC" firstStartedPulling="2026-03-08 00:14:00.929198769 +0000 UTC m=+495.048831002" lastFinishedPulling="2026-03-08 00:14:02.958356043 +0000 UTC m=+497.077988266" observedRunningTime="2026-03-08 00:14:03.64611388 +0000 UTC m=+497.765746123" watchObservedRunningTime="2026-03-08 00:14:03.647928127 +0000 UTC m=+497.767560360" Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.649539 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rc7p9" podStartSLOduration=5.170337096 podStartE2EDuration="9.649531829s" podCreationTimestamp="2026-03-08 00:13:54 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.55054455 +0000 UTC m=+492.670176783" lastFinishedPulling="2026-03-08 00:14:03.029739293 +0000 UTC m=+497.149371516" observedRunningTime="2026-03-08 00:14:03.630796618 +0000 UTC m=+497.750428861" watchObservedRunningTime="2026-03-08 00:14:03.649531829 +0000 UTC m=+497.769164092" Mar 08 00:14:03 crc 
kubenswrapper[4713]: I0308 00:14:03.661268 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.501961 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502256 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502314 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502927 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502983 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" gracePeriod=600 Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 
00:14:04.626515 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerStarted","Data":"5f7880248ba24ca09a433e4b3f7504ae02a28a23e62dd8888c6a6f16a95d5a69"} Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.627781 4713 generic.go:334] "Generic (PLEG): container finished" podID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerID="dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53" exitCode=0 Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.627863 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerDied","Data":"dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53"} Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.644429 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4b75j" podStartSLOduration=2.620818456 podStartE2EDuration="7.644412571s" podCreationTimestamp="2026-03-08 00:13:57 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.544427329 +0000 UTC m=+492.664059562" lastFinishedPulling="2026-03-08 00:14:03.568021444 +0000 UTC m=+497.687653677" observedRunningTime="2026-03-08 00:14:04.642984153 +0000 UTC m=+498.762616406" watchObservedRunningTime="2026-03-08 00:14:04.644412571 +0000 UTC m=+498.764044794" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.068931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.069220 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.267280 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.267625 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.301035 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.635996 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" exitCode=0 Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.636087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.637113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.637136 4713 scope.go:117] "RemoveContainer" containerID="ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.926242 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.989883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"4a8563b5-1794-4b14-b040-5694cafd63e8\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.994444 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s" (OuterVolumeSpecName: "kube-api-access-lq57s") pod "4a8563b5-1794-4b14-b040-5694cafd63e8" (UID: "4a8563b5-1794-4b14-b040-5694cafd63e8"). InnerVolumeSpecName "kube-api-access-lq57s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.091556 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.109178 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rc7p9" podUID="dd52d225-2e7e-4958-98fc-52028b545353" containerName="registry-server" probeResult="failure" output=< Mar 08 00:14:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:14:06 crc kubenswrapper[4713]: > Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644749 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerDied","Data":"1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d"} Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644795 
4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644774 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.699748 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.703457 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:14:07 crc kubenswrapper[4713]: I0308 00:14:07.472403 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:07 crc kubenswrapper[4713]: I0308 00:14:07.472692 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:08 crc kubenswrapper[4713]: I0308 00:14:08.511907 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4b75j" podUID="47027c84-0848-4140-bed0-b04f627cf6da" containerName="registry-server" probeResult="failure" output=< Mar 08 00:14:08 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:14:08 crc kubenswrapper[4713]: > Mar 08 00:14:08 crc kubenswrapper[4713]: I0308 00:14:08.549127 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" path="/var/lib/kubelet/pods/fdccd72c-79d7-4388-926e-0539c571dafe/volumes" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.106412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.154750 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.335696 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.060960 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" containerID="cri-o://6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" gracePeriod=15 Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.467610 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493549 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493620 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod 
\"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493704 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493726 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493763 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: 
\"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493850 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493940 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 
00:14:17.494045 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494333 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495355 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495895 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.509977 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.510413 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520639 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.520939 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.520975 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520983 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521101 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521115 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521592 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527599 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527721 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528133 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528957 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.529196 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.536321 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps" (OuterVolumeSpecName: "kube-api-access-mp6ps") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "kube-api-access-mp6ps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.578271 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595311 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595400 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595421 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: 
\"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595439 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595459 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595730 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596109 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 
08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596246 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596321 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597071 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597110 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597123 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597133 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597144 4713 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597175 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597187 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597196 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597208 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597217 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597225 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 
00:14:17.597234 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597265 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597275 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698731 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " 
pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699374 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699424 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699991 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc 
kubenswrapper[4713]: I0308 00:14:17.700066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.700182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.700517 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.702947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.702981 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod 
\"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703266 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703676 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703945 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.704640 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.705122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.706887 4713 generic.go:334] "Generic (PLEG): container finished" podID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" exitCode=0 Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.706948 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707045 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerDied","Data":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707170 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerDied","Data":"e0d410e7c38a223bcd0189e0430b8bd6e62ba561f8515070eac1a52a52fdb35d"} Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707254 4713 scope.go:117] "RemoveContainer" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.716061 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.729176 4713 scope.go:117] "RemoveContainer" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.731501 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": container with ID starting with 6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d not found: ID does not exist" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 
00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.731588 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} err="failed to get container status \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": rpc error: code = NotFound desc = could not find container \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": container with ID starting with 6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d not found: ID does not exist" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.747557 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.750077 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.879330 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.306725 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:18 crc kubenswrapper[4713]: W0308 00:14:18.312104 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb9e6372_a327_41fd_8d17_662579df072a.slice/crio-4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b WatchSource:0}: Error finding container 4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b: Status 404 returned error can't find the container with id 4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.548366 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" path="/var/lib/kubelet/pods/c9df8d9c-b59f-4a1c-9fb4-668123290569/volumes" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.715638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" event={"ID":"bb9e6372-a327-41fd-8d17-662579df072a","Type":"ContainerStarted","Data":"a6e2c3505648e6e6b5ac48c44e2d593c92dff1d93421a5564c50f1e00b84de99"} Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.716598 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" event={"ID":"bb9e6372-a327-41fd-8d17-662579df072a","Type":"ContainerStarted","Data":"4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b"} Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.716726 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.739621 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" podStartSLOduration=26.739598277 podStartE2EDuration="26.739598277s" podCreationTimestamp="2026-03-08 00:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:14:18.735121369 +0000 UTC m=+512.854753622" watchObservedRunningTime="2026-03-08 00:14:18.739598277 +0000 UTC m=+512.859230520" Mar 08 00:14:19 crc kubenswrapper[4713]: I0308 00:14:19.094019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.151258 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.152632 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.154295 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.154454 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.156996 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.305938 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.316484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.321570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.472898 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.684677 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937405 4713 generic.go:334] "Generic (PLEG): container finished" podID="4976d892-c6f5-417a-a992-72cf7e278170" containerID="b268d8626cb813a8937d020ece6a7ce9fef74733b7a185d9b285e8849e08f38b" exitCode=0 Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerDied","Data":"b268d8626cb813a8937d020ece6a7ce9fef74733b7a185d9b285e8849e08f38b"} Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" 
event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerStarted","Data":"b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122"} Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.143195 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329022 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329919 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume" (OuterVolumeSpecName: "config-volume") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.334306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg" (OuterVolumeSpecName: "kube-api-access-xj8jg") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "kube-api-access-xj8jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.334341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431245 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431301 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431335 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" 
event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerDied","Data":"b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122"} Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953340 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953289 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.132669 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: E0308 00:16:00.133537 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.133555 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.133703 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.134167 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.137653 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.137998 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.138118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.151396 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.228943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.330113 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.350138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " 
pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.459032 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.637987 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.649968 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:16:01 crc kubenswrapper[4713]: I0308 00:16:01.306665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerStarted","Data":"6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46"} Mar 08 00:16:03 crc kubenswrapper[4713]: I0308 00:16:03.323314 4713 generic.go:334] "Generic (PLEG): container finished" podID="e4623866-795f-438d-9b3b-66afb30f9657" containerID="88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e" exitCode=0 Mar 08 00:16:03 crc kubenswrapper[4713]: I0308 00:16:03.323374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerDied","Data":"88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e"} Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.500911 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.500970 4713 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.523020 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.686245 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"e4623866-795f-438d-9b3b-66afb30f9657\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.697038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv" (OuterVolumeSpecName: "kube-api-access-654jv") pod "e4623866-795f-438d-9b3b-66afb30f9657" (UID: "e4623866-795f-438d-9b3b-66afb30f9657"). InnerVolumeSpecName "kube-api-access-654jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.788116 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") on node \"crc\" DevicePath \"\"" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.336768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerDied","Data":"6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46"} Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.337039 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.336892 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.574340 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.577997 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:16:06 crc kubenswrapper[4713]: I0308 00:16:06.547731 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6470285d-4460-4c72-be17-00e880cc623d" path="/var/lib/kubelet/pods/6470285d-4460-4c72-be17-00e880cc623d/volumes" Mar 08 00:16:34 crc kubenswrapper[4713]: I0308 00:16:34.500582 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 00:16:34 crc kubenswrapper[4713]: I0308 00:16:34.501096 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.501208 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.501993 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502057 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502811 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502949 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596" gracePeriod=600 Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.682319 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596" exitCode=0 Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.682437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.683341 4713 scope.go:117] "RemoveContainer" containerID="01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" Mar 08 00:17:05 crc kubenswrapper[4713]: I0308 00:17:05.689301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"} Mar 08 00:17:12 crc kubenswrapper[4713]: I0308 00:17:12.761142 4713 scope.go:117] "RemoveContainer" containerID="11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913" Mar 08 00:17:12 crc kubenswrapper[4713]: I0308 00:17:12.808075 4713 scope.go:117] "RemoveContainer" containerID="1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149051 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:00 crc kubenswrapper[4713]: E0308 00:18:00.149854 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149870 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149975 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.151112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.154443 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.154962 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.158653 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.163368 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.190579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.291931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: 
\"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.312875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.472068 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.677506 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:01 crc kubenswrapper[4713]: I0308 00:18:01.012011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerStarted","Data":"e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d"} Mar 08 00:18:03 crc kubenswrapper[4713]: I0308 00:18:03.023672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerDied","Data":"0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f"} Mar 08 00:18:03 crc kubenswrapper[4713]: I0308 00:18:03.024040 4713 generic.go:334] "Generic (PLEG): container finished" podID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerID="0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f" exitCode=0 Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.227730 4713 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.235457 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.243267 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf" (OuterVolumeSpecName: "kube-api-access-tv6wf") pod "bbf256d4-02b4-46fd-86a1-793e34a17bf5" (UID: "bbf256d4-02b4-46fd-86a1-793e34a17bf5"). InnerVolumeSpecName "kube-api-access-tv6wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.336486 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") on node \"crc\" DevicePath \"\"" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerDied","Data":"e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d"} Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036777 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036869 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.284099 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.287018 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:18:06 crc kubenswrapper[4713]: I0308 00:18:06.554985 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" path="/var/lib/kubelet/pods/12cdabef-a56e-45d2-8896-aab98bd84fb1/volumes" Mar 08 00:18:12 crc kubenswrapper[4713]: I0308 00:18:12.866159 4713 scope.go:117] "RemoveContainer" containerID="71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.630193 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:56 crc kubenswrapper[4713]: E0308 00:18:56.631128 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631141 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631263 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631673 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.648261 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725564 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725672 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725739 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.750901 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826814 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826919 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc 
kubenswrapper[4713]: I0308 00:18:56.826936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826963 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.827651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.828063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.828986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.833196 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.833265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.843519 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.847591 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: 
\"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.948538 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.127524 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" event={"ID":"ddda6293-48b1-4007-bb9c-b3657e684836","Type":"ContainerStarted","Data":"53f962bfd47cf0b0c18eb9485e287e2a19142df7872b87cbbf68ac0e7f60a938"} Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325582 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" event={"ID":"ddda6293-48b1-4007-bb9c-b3657e684836","Type":"ContainerStarted","Data":"64b8203ca59c865f4bfca95896c57cb2e4bd11333fc749708ca86b77a4f880cb"} Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325687 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.349190 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" podStartSLOduration=1.349175606 podStartE2EDuration="1.349175606s" podCreationTimestamp="2026-03-08 00:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:18:57.349114954 +0000 UTC m=+791.468747197" watchObservedRunningTime="2026-03-08 00:18:57.349175606 +0000 UTC m=+791.468807839" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 
00:19:04.176057 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176867 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" containerID="cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176886 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" containerID="cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176996 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" containerID="cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.177018 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" containerID="cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176955 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: 
I0308 00:19:04.177023 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" containerID="cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176874 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" containerID="cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.209464 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" containerID="cri-o://824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.373017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.379131 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.379702 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380084 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" 
exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380167 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380221 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380274 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" exitCode=143 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380331 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" exitCode=143 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380419 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} Mar 08 
00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380628 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380698 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380783 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382511 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382889 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382984 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" exitCode=2 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.383558 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" Mar 08 00:19:04 crc 
kubenswrapper[4713]: E0308 00:19:04.383905 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.479581 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-conmon-8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.493605 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.500886 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.501029 4713 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.515567 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.516036 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.516484 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565075 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g77c"] Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565281 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565292 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565301 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kubecfg-setup" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565307 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kubecfg-setup" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565314 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565320 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565328 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565334 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565341 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565347 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565359 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565365 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565373 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565380 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565387 4713 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565394 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565404 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565409 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565416 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565422 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565429 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565434 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565441 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565447 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565539 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565548 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565554 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565562 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565569 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565577 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565586 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565594 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565601 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565608 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565615 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565622 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565712 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.567228 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625932 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625947 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod 
\"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625956 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625992 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket" (OuterVolumeSpecName: "log-socket") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626030 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log" (OuterVolumeSpecName: "node-log") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626089 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626130 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626400 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626423 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626465 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626562 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626577 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626593 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626629 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626840 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627337 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627344 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627353 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627394 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash" (OuterVolumeSpecName: "host-slash") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627543 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627520 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627699 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627900 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628125 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628241 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628275 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628310 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628361 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628382 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628444 4713 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628458 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628473 4713 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628486 4713 reconciler_common.go:293] "Volume detached 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628498 4713 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628509 4713 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628520 4713 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628532 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628543 4713 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628554 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628565 4713 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628575 4713 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628586 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628597 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628608 4713 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628618 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628628 4713 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.631114 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.631306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z" (OuterVolumeSpecName: "kube-api-access-zl27z") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "kube-api-access-zl27z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.638057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.729859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730273 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730317 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730372 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730411 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730415 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730441 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730457 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730576 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730607 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730628 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730712 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730777 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod 
\"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730918 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730924 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730984 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731014 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731489 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: 
I0308 00:19:04.731570 4713 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731585 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731597 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731762 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731908 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.734187 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.745322 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.882150 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: W0308 00:19:04.904049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4b1127_6d10_4c83_b3e9_f588af09812c.slice/crio-ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464 WatchSource:0}: Error finding container ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464: Status 404 returned error can't find the container with id ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.395765 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397171 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397478 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397567 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" 
containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397629 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397634 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397574 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397972 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.398007 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.398025 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 
00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400722 4713 generic.go:334] "Generic (PLEG): container finished" podID="1d4b1127-6d10-4c83-b3e9-f588af09812c" containerID="3e0a22bf48247677a94d418562e87f416f360a48b70ed912f3114a78b57c2d60" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400802 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerDied","Data":"3e0a22bf48247677a94d418562e87f416f360a48b70ed912f3114a78b57c2d60"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400866 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.405125 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.432039 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.462841 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.464715 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.467359 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.478167 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc 
kubenswrapper[4713]: I0308 00:19:05.498781 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.509195 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.520880 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.533774 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.557945 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576052 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.576489 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576537 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with 
ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576563 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.576814 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576883 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576901 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577156 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 
00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577177 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577194 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577641 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577672 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577693 4713 scope.go:117] "RemoveContainer" 
containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577942 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577968 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577985 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578237 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578264 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578282 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578540 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578563 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578581 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578955 4713 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578983 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579000 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.579250 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579276 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could not find container 
\"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579294 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579656 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579682 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580129 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580163 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580648 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580681 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580986 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581018 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581350 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with 
dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581382 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581647 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581682 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582067 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582100 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582374 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582406 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582723 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582750 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583121 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583149 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583409 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583436 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583696 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583719 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583974 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583999 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584259 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584280 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584483 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584505 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584755 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584780 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584995 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.585019 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"
Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.585351 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist"
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7901a062dea925d54a34042d1f82694290b94ca627c557a0fd9af9e433a01a97"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"69117129efda018065e7176231b21d798a0439111c11a6c53ecae2d7c8adbebe"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"c5746d707fcc3714d3fb41ba9ae86870afb570bb3db246f11b923d439a992674"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7d35667920095de84c60802ce5f061f2ba8155950a8007ea8212448a4d4368cc"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"3308cba4d6d172163bf7dbe7e2ef98f12fbc51546d7f4a161d8b6e99740e1b2a"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7cf9283a95da08ae58f85f219e102e1918af08f88130b0effa8d4396cd928086"}
Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.547892 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" path="/var/lib/kubelet/pods/56fbba07-87e8-4e77-b834-ed68af718d11/volumes"
Mar 08 00:19:08 crc kubenswrapper[4713]: I0308 00:19:08.428559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"25b677aaa77329ac51c033fd2d56c3625249138ad984ae7e49707909ba0514ca"}
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"e0ccd78bd4e9bea221c0d60a3b046309bf5139ba8beb597d15579b79e5d4fb16"}
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443974 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443993 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.444006 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.466578 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.468924 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.473079 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" podStartSLOduration=6.473061281 podStartE2EDuration="6.473061281s" podCreationTimestamp="2026-03-08 00:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:19:10.471030978 +0000 UTC m=+804.590663231" watchObservedRunningTime="2026-03-08 00:19:10.473061281 +0000 UTC m=+804.592693524"
Mar 08 00:19:12 crc kubenswrapper[4713]: I0308 00:19:12.922059 4713 scope.go:117] "RemoveContainer" containerID="a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04"
Mar 08 00:19:16 crc kubenswrapper[4713]: I0308 00:19:16.954201 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p"
Mar 08 00:19:17 crc kubenswrapper[4713]: I0308 00:19:17.003336 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"]
Mar 08 00:19:19 crc kubenswrapper[4713]: I0308 00:19:19.540300 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"
Mar 08 00:19:19 crc kubenswrapper[4713]: E0308 00:19:19.540494 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561"
Mar 08 00:19:30 crc kubenswrapper[4713]: I0308 00:19:30.540812 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"
Mar 08 00:19:31 crc kubenswrapper[4713]: I0308 00:19:31.556491 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log"
Mar 08 00:19:31 crc kubenswrapper[4713]: I0308 00:19:31.557120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"4ba8c147465404e7712fc0edbf400ab1fea985cebc5927beacab6ccd5020b59c"}
Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.501166 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.501235 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.909724 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.043578 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" containerID="cri-o://93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" gracePeriod=30
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.485461 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546672 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546704 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546933 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546962 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547052 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") "
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547739 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.549173 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.552120 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.552530 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw" (OuterVolumeSpecName: "kube-api-access-gk5fw") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "kube-api-access-gk5fw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.554480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.554870 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.559095 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.562931 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.610951 4713 generic.go:334] "Generic (PLEG): container finished" podID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" exitCode=0
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.610990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerDied","Data":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"}
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerDied","Data":"bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d"}
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611023 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611029 4713 scope.go:117] "RemoveContainer" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.628697 4713 scope.go:117] "RemoveContainer" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"
Mar 08 00:19:42 crc kubenswrapper[4713]: E0308 00:19:42.629318 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": container with ID starting with 93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff not found: ID does not exist" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.629359 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"} err="failed to get container status \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": rpc error: code = NotFound desc = could not find container \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": container with ID starting with 93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff not found: ID does not exist"
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.639367 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"]
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.643726 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"]
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648651 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648683 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648707 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648717 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648730 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648745 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648756 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:44 crc kubenswrapper[4713]: I0308 00:19:44.546961 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" path="/var/lib/kubelet/pods/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9/volumes"
Mar 08 00:19:47 crc kubenswrapper[4713]: I0308 00:19:47.548151 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.125063 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.125777 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4m4tz" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" containerID="cri-o://23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" gracePeriod=30
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.489800 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625441 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") "
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625523 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") "
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625549 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") "
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.627138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities" (OuterVolumeSpecName: "utilities") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.631356 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5" (OuterVolumeSpecName: "kube-api-access-mrdd5") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "kube-api-access-mrdd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649465 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" exitCode=0
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649513 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"}
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"872b442fcf53dc350c20c113c6415793cd135f6045c9203dc5387eb2fa9f45e6"}
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649580 4713 scope.go:117] "RemoveContainer" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649526 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.652536 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.665251 4713 scope.go:117] "RemoveContainer" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.677704 4713 scope.go:117] "RemoveContainer" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.692892 4713 scope.go:117] "RemoveContainer" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"
Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.693432 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": container with ID starting with 23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904 not found: ID does not exist" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693459 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"} err="failed to get container status \"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": rpc error: code = NotFound desc = could not find container \"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": container with ID starting with 23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904 not found: ID does not exist"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693479 4713 scope.go:117] "RemoveContainer" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"
Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.693849 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": container with ID starting with dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87 not found: ID does not exist" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693898 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"} err="failed to get container status \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": rpc error: code = NotFound desc = could not find container \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": container with ID starting with dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87 not found: ID does not exist"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693929 4713 scope.go:117] "RemoveContainer" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"
Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.694209 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": container with ID starting with b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6 not found: ID does not exist" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.694237 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"} err="failed to get container status \"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": rpc error: code = NotFound desc = could not find container \"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": container with ID starting with b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6 not found: ID does not exist"
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727445 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727491 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727504 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.990295 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.998512 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:19:50 crc kubenswrapper[4713]: I0308 00:19:50.548891 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" path="/var/lib/kubelet/pods/cb44436e-472b-4a5f-8ff6-06242535e835/volumes"
Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680111 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"]
Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-utilities"
Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680339 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-utilities"
Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680353 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry"
Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680359 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry"
Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680373 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server"
Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680379 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server"
Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680390 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-content"
Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680398 4713 state_mem.go:107] "Deleted
CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-content" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680490 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680502 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.681258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.684396 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.691664 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"] Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860141 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860230 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962181 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: 
I0308 00:19:52.962740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.982147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.002739 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.395450 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"] Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.670670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerStarted","Data":"1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963"} Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.671033 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerStarted","Data":"4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d"} Mar 08 00:19:54 crc kubenswrapper[4713]: I0308 00:19:54.676623 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963" exitCode=0 Mar 08 00:19:54 crc kubenswrapper[4713]: I0308 00:19:54.676676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963"} Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.850436 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.856073 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.856502 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001098 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001203 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.103623 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.103965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.104241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.104606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.105052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.125028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.179127 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.377160 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:56 crc kubenswrapper[4713]: W0308 00:19:56.378726 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeebc8d8_7e37_468b_a3b9_4ef9e73afb7a.slice/crio-3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3 WatchSource:0}: Error finding container 3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3: Status 404 returned error can't find the container with id 3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.690320 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="a736c4ba1de9eee3e4e1fba600b72037c5c4ae6b13a53129cedc82690a0bf9d4" exitCode=0 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.690534 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"a736c4ba1de9eee3e4e1fba600b72037c5c4ae6b13a53129cedc82690a0bf9d4"} Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.692735 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361" exitCode=0 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.692763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"} Mar 08 00:19:56 crc 
kubenswrapper[4713]: I0308 00:19:56.692781 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3"} Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.700675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.703079 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="ecc5a233466087ba46cc571d3010af15eff315f61d103d413f967cc98b050e7f" exitCode=0 Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.703115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"ecc5a233466087ba46cc571d3010af15eff315f61d103d413f967cc98b050e7f"} Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.709730 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8" exitCode=0 Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.709860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.923330 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037601 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037761 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.040432 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle" (OuterVolumeSpecName: "bundle") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.043982 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4" (OuterVolumeSpecName: "kube-api-access-gvwt4") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "kube-api-access-gvwt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.138912 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.138946 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.231591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util" (OuterVolumeSpecName: "util") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.239860 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732065 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d"} Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732529 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.735152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"} Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.757250 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6sch" podStartSLOduration=2.36665676 podStartE2EDuration="4.757230795s" podCreationTimestamp="2026-03-08 00:19:55 +0000 UTC" firstStartedPulling="2026-03-08 00:19:56.694166255 +0000 UTC m=+850.813798488" lastFinishedPulling="2026-03-08 00:19:59.08474029 +0000 UTC m=+853.204372523" observedRunningTime="2026-03-08 00:19:59.754281198 +0000 UTC m=+853.873913451" watchObservedRunningTime="2026-03-08 00:19:59.757230795 +0000 UTC m=+853.876863028" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886194 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"] Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886411 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886431 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886449 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="pull" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886458 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="pull" Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886472 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="util" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886481 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="util" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886618 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.887446 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.889900 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.899633 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"] Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.947331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.947482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.048982 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049058 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049096 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049795 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049929 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.134444 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"] Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.135609 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137301 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137445 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137842 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.140960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.151426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.152252 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.169076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.244176 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.254139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.282475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.426377 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.452740 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.615565 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:20:00 crc kubenswrapper[4713]: W0308 00:20:00.623157 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c62a3d3_0f8a_40d6_a2f0_b860e9c85085.slice/crio-90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9 WatchSource:0}: Error finding container 90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9: Status 404 returned error can't find the container with id 90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.740864 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerStarted","Data":"90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742557 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="186c363db3b3f5848bf217802d858e513ea39f3d481d7f645c52991e2dbdc59e" exitCode=0
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"186c363db3b3f5848bf217802d858e513ea39f3d481d7f645c52991e2dbdc59e"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742681 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.889143 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.891333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.900612 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063168 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165081 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165547 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.184085 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.215649 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.394945 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750133 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="3fbf74e5fa454b583c7cbbe45cb691fc6bd2392bfaf1d1ffec1a8bc6f6b3cef6" exitCode=0
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750196 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"3fbf74e5fa454b583c7cbbe45cb691fc6bd2392bfaf1d1ffec1a8bc6f6b3cef6"}
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750519 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.761927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.763866 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerStarted","Data":"f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.765468 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.803144 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548820-cts7b" podStartSLOduration=1.156346864 podStartE2EDuration="3.803125501s" podCreationTimestamp="2026-03-08 00:20:00 +0000 UTC" firstStartedPulling="2026-03-08 00:20:00.625168307 +0000 UTC m=+854.744800540" lastFinishedPulling="2026-03-08 00:20:03.271946944 +0000 UTC m=+857.391579177" observedRunningTime="2026-03-08 00:20:03.800390179 +0000 UTC m=+857.920022422" watchObservedRunningTime="2026-03-08 00:20:03.803125501 +0000 UTC m=+857.922757744"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501518 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501576 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501620 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.502174 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.502228 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4" gracePeriod=600
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.771707 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerID="f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe" exitCode=0
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.771760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerDied","Data":"f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"}
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.854092 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.855398 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.912812 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.009684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.009729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.010205 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112410 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.113044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.113563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.139388 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.249185 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.721018 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.781043 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.782149 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797155 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797255 4713 scope.go:117] "RemoveContainer" containerID="04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.811241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"bbb7c668e198fab933a09095559493804adf46dd60ac7836615cd7c4aef891ab"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.827887 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.828073 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.180665 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.180731 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.270191 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.287194 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.426612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") "
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.443170 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv" (OuterVolumeSpecName: "kube-api-access-ggxxv") pod "8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" (UID: "8c62a3d3-0f8a-40d6-a2f0-b860e9c85085"). InnerVolumeSpecName "kube-api-access-ggxxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.527615 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") on node \"crc\" DevicePath \"\""
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.835214 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.837624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.839185 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f" exitCode=0
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.839242 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerDied","Data":"90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842511 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842520 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.848126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.893591 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" podStartSLOduration=5.365456665 podStartE2EDuration="7.893573559s" podCreationTimestamp="2026-03-08 00:19:59 +0000 UTC" firstStartedPulling="2026-03-08 00:20:00.744219521 +0000 UTC m=+854.863851754" lastFinishedPulling="2026-03-08 00:20:03.272336415 +0000 UTC m=+857.391968648" observedRunningTime="2026-03-08 00:20:06.87417442 +0000 UTC m=+860.993806663" watchObservedRunningTime="2026-03-08 00:20:06.893573559 +0000 UTC m=+861.013205792"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.927538 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" podStartSLOduration=5.407772094 podStartE2EDuration="6.927517799s" podCreationTimestamp="2026-03-08 00:20:00 +0000 UTC" firstStartedPulling="2026-03-08 00:20:01.751359357 +0000 UTC m=+855.870991600" lastFinishedPulling="2026-03-08 00:20:03.271105072 +0000 UTC m=+857.390737305" observedRunningTime="2026-03-08 00:20:06.923960716 +0000 UTC m=+861.043592949" watchObservedRunningTime="2026-03-08 00:20:06.927517799 +0000 UTC m=+861.047150022"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.935503 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.365353 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"]
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.373957 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"]
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.854940 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f" exitCode=0
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.855254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f"}
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.856989 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0" exitCode=0
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.857022 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0"}
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.859696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390"}
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.429722 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"]
Mar 08 00:20:08 crc kubenswrapper[4713]: E0308 00:20:08.429937 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.429949 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.430086 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.430759 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.498058 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"]
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.547387 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" path="/var/lib/kubelet/pods/4a8563b5-1794-4b14-b040-5694cafd63e8/volumes"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.551967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.552050 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.552127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653435 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.654559 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.689856 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.764190 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.441540 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.450404 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571218 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571239 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571261 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571304 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571340 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.572158 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle" (OuterVolumeSpecName: "bundle") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.573354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle" (OuterVolumeSpecName: "bundle") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.579273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm" (OuterVolumeSpecName: "kube-api-access-ncqrm") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "kube-api-access-ncqrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.581268 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk" (OuterVolumeSpecName: "kube-api-access-9gqdk") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "kube-api-access-9gqdk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.591957 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util" (OuterVolumeSpecName: "util") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.593682 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util" (OuterVolumeSpecName: "util") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.616568 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"] Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672406 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672447 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672460 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672473 4713 reconciler_common.go:293] "Volume detached for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672485 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672498 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876234 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876560 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876295 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.878133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerStarted","Data":"fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.878172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerStarted","Data":"8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880491 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880491 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.882568 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390" exitCode=0 Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.882619 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.878714 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879388 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879410 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879432 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879440 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879450 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879457 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" 
containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879470 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879476 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879485 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879492 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879502 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879507 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879598 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879612 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.880145 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.881744 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.882131 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hxq7b" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.882254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889347 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a" exitCode=0 Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889639 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.892672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.989017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvbs\" (UniqueName: 
\"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.993544 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-75hx9" podStartSLOduration=3.353857585 podStartE2EDuration="6.993521132s" podCreationTimestamp="2026-03-08 00:20:04 +0000 UTC" firstStartedPulling="2026-03-08 00:20:06.840454015 +0000 UTC m=+860.960086248" lastFinishedPulling="2026-03-08 00:20:10.480117562 +0000 UTC m=+864.599749795" observedRunningTime="2026-03-08 00:20:10.983934281 +0000 UTC m=+865.103566534" watchObservedRunningTime="2026-03-08 00:20:10.993521132 +0000 UTC m=+865.113153365" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.039792 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.040612 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.044284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.045081 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qsrk4" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.051721 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.063051 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.063902 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.087606 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.091019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvbs\" (UniqueName: \"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.131876 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvbs\" (UniqueName: \"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192193 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192250 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: 
\"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192421 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.197105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.201722 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.202726 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.206339 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.206615 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dswwm" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.246805 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.247045 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6sch" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server" containerID="cri-o://6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" gracePeriod=2 Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.264779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.293925 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.293974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: 
\"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294016 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294056 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294075 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 
00:20:11.297578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.301405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.301405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.312587 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.359093 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.386592 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.394784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.394890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.401804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.414758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " 
pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.438627 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.441995 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.444213 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vhpxz" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.450655 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.502396 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.566595 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.599084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.599152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.699838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.700185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.701534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.718206 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.727803 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.769279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"]
Mar 08 00:20:11 crc kubenswrapper[4713]: W0308 00:20:11.779684 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2152c14_6da7_4f74_a30e_da9e4e7c1acc.slice/crio-9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180 WatchSource:0}: Error finding container 9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180: Status 404 returned error can't find the container with id 9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.789023 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tw72p"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.800881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") "
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.800970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") "
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.801015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") "
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.805405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx" (OuterVolumeSpecName: "kube-api-access-f2btx") pod "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "kube-api-access-f2btx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.820435 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities" (OuterVolumeSpecName: "utilities") pod "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.893542 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"]
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.900978 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"]
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.903601 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.903633 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") on node \"crc\" DevicePath \"\""
Mar 08 00:20:11 crc kubenswrapper[4713]: W0308 00:20:11.910938 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860dc604_80d3_4d4b_8b1e_8a430b706882.slice/crio-a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd WatchSource:0}: Error finding container a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd: Status 404 returned error can't find the container with id a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915777 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" exitCode=0
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"}
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915934 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3"}
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915957 4713 scope.go:117] "RemoveContainer" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.916143 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.933583 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" event={"ID":"e2152c14-6da7-4f74-a30e-da9e4e7c1acc","Type":"ContainerStarted","Data":"9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180"}
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.939043 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" event={"ID":"1f48c701-2464-42f6-b2d7-c851ae965f1b","Type":"ContainerStarted","Data":"a399e931a76cf9a50eb6862305dddec7ffe6e8c7ec95be07abefa57e3108aaf6"}
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.954635 4713 scope.go:117] "RemoveContainer" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.955008 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.986907 4713 scope.go:117] "RemoveContainer" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.004368 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.023002 4713 scope.go:117] "RemoveContainer" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"
Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024144 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": container with ID starting with 6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e not found: ID does not exist" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024179 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"} err="failed to get container status \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": rpc error: code = NotFound desc = could not find container \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": container with ID starting with 6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e not found: ID does not exist"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024204 4713 scope.go:117] "RemoveContainer" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"
Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024653 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": container with ID starting with 8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8 not found: ID does not exist" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024678 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} err="failed to get container status \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": rpc error: code = NotFound desc = could not find container \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": container with ID starting with 8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8 not found: ID does not exist"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024695 4713 scope.go:117] "RemoveContainer" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"
Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024960 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": container with ID starting with 484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361 not found: ID does not exist" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024980 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"} err="failed to get container status \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": rpc error: code = NotFound desc = could not find container \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": container with ID starting with 484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361 not found: ID does not exist"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.072058 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"]
Mar 08 00:20:12 crc kubenswrapper[4713]: W0308 00:20:12.089479 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1a0596_7485_4376_9630_688753a7abd7.slice/crio-98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0 WatchSource:0}: Error finding container 98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0: Status 404 returned error can't find the container with id 98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.248271 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"]
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.252685 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"]
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.565250 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" path="/var/lib/kubelet/pods/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a/volumes"
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.947194 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" event={"ID":"3d1a0596-7485-4376-9630-688753a7abd7","Type":"ContainerStarted","Data":"98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0"}
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.953109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" event={"ID":"860dc604-80d3-4d4b-8b1e-8a430b706882","Type":"ContainerStarted","Data":"a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd"}
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.954334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" event={"ID":"f559f6d0-89dc-4d38-807f-491671408dc7","Type":"ContainerStarted","Data":"e8f02f5581939087fbccea10db90ae57c616fe33ee236dcbc9c2e6165ecfd6ff"}
Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.963990 4713 scope.go:117] "RemoveContainer" containerID="dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53"
Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.257603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.258732 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.360537 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.081148 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564472 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"]
Mar 08 00:20:16 crc kubenswrapper[4713]: E0308 00:20:16.564743 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-content"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-content"
Mar 08 00:20:16 crc kubenswrapper[4713]: E0308 00:20:16.564781 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564789 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server"
Mar 08 00:20:16 crc kubenswrapper[4713]: E0308 00:20:16.564807 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-utilities"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564816 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-utilities"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564955 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.565436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.573878 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574005 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574076 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574029 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-f4ckr"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.581699 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"]
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.675950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.675986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.676053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777305 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.786261 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.797909 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.837246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.891813 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx"
Mar 08 00:20:17 crc kubenswrapper[4713]: I0308 00:20:17.032739 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:19 crc kubenswrapper[4713]: I0308 00:20:19.008259 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-75hx9" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" containerID="cri-o://44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" gracePeriod=2
Mar 08 00:20:20 crc kubenswrapper[4713]: I0308 00:20:20.017176 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" exitCode=0
Mar 08 00:20:20 crc kubenswrapper[4713]: I0308 00:20:20.017255 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11"}
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.010027 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"]
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.010717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz"
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.012465 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-pcr9h"
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.020743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"]
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.150860 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz"
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.251627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz"
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.273903 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz"
Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.325989 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz"
Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.250234 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251444 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251741 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"]
Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251770 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-75hx9" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server"
Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.976575 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8"
Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.977051 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m68qd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-tw72p_openshift-operators(3d1a0596-7485-4376-9630-688753a7abd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.978228 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podUID="3d1a0596-7485-4376-9630-688753a7abd7"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.091008 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podUID="3d1a0596-7485-4376-9630-688753a7abd7"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.554693 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.555149 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_openshift-operators(860dc604-80d3-4d4b-8b1e-8a430b706882): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.556303 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podUID="860dc604-80d3-4d4b-8b1e-8a430b706882"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.567975 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.568138 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_openshift-operators(e2152c14-6da7-4f74-a30e-da9e4e7c1acc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.570434 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podUID="e2152c14-6da7-4f74-a30e-da9e4e7c1acc"
Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.603492 4713 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745421 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745790 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745860 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.746424 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities" (OuterVolumeSpecName: "utilities") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.752624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw" (OuterVolumeSpecName: "kube-api-access-8thgw") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "kube-api-access-8thgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.799548 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848233 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848266 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848280 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.865246 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"] Mar 08 00:20:28 crc kubenswrapper[4713]: W0308 00:20:28.873963 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a74652_f05c_47a0_8caa_77f544c95128.slice/crio-0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e WatchSource:0}: Error finding container 0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e: Status 404 returned error can't find the container with id 
0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.920935 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"] Mar 08 00:20:28 crc kubenswrapper[4713]: W0308 00:20:28.978880 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b64282_4957_4a04_b1be_6d3184bfdd25.slice/crio-1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed WatchSource:0}: Error finding container 1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed: Status 404 returned error can't find the container with id 1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.089808 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="e17c10baaae1bb1ca1dbac2e430fa8d136422f6db5c8355746dc1a5178fc022a" exitCode=0 Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.090109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"e17c10baaae1bb1ca1dbac2e430fa8d136422f6db5c8355746dc1a5178fc022a"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.092986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" event={"ID":"e5a74652-f05c-47a0-8caa-77f544c95128","Type":"ContainerStarted","Data":"0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" 
event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"bbb7c668e198fab933a09095559493804adf46dd60ac7836615cd7c4aef891ab"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095464 4713 scope.go:117] "RemoveContainer" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095562 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.106737 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" event={"ID":"1f48c701-2464-42f6-b2d7-c851ae965f1b","Type":"ContainerStarted","Data":"84eb15f0c45fbb8d9e26d8f14a0b23a2410d8eb034e181470055fc7cad692b3e"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.116304 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" event={"ID":"f559f6d0-89dc-4d38-807f-491671408dc7","Type":"ContainerStarted","Data":"e3fb6bbf96dfb0a98be7f50a8fa5aef291423b0bdc000b4e2acd8e1d1996ef3d"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.123265 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.139447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" event={"ID":"37b64282-4957-4a04-b1be-6d3184bfdd25","Type":"ContainerStarted","Data":"1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.141891 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"] Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.145657 4713 scope.go:117] 
"RemoveContainer" containerID="203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390" Mar 08 00:20:29 crc kubenswrapper[4713]: E0308 00:20:29.145996 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podUID="e2152c14-6da7-4f74-a30e-da9e4e7c1acc" Mar 08 00:20:29 crc kubenswrapper[4713]: E0308 00:20:29.146138 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podUID="860dc604-80d3-4d4b-8b1e-8a430b706882" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.150445 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"] Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.183310 4713 scope.go:117] "RemoveContainer" containerID="637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.188426 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.193924 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" 
podStartSLOduration=2.135290849 podStartE2EDuration="19.193905875s" podCreationTimestamp="2026-03-08 00:20:10 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.518512217 +0000 UTC m=+865.638144450" lastFinishedPulling="2026-03-08 00:20:28.577127243 +0000 UTC m=+882.696759476" observedRunningTime="2026-03-08 00:20:29.159246786 +0000 UTC m=+883.278879019" watchObservedRunningTime="2026-03-08 00:20:29.193905875 +0000 UTC m=+883.313538108" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.194342 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" podStartSLOduration=1.526992467 podStartE2EDuration="18.194336947s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.936535595 +0000 UTC m=+866.056167828" lastFinishedPulling="2026-03-08 00:20:28.603880075 +0000 UTC m=+882.723512308" observedRunningTime="2026-03-08 00:20:29.192131969 +0000 UTC m=+883.311764202" watchObservedRunningTime="2026-03-08 00:20:29.194336947 +0000 UTC m=+883.313969180" Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.149411 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="2b71c90d3947e985d3c60cc0dd27d2933e68e61be980f33dcd93b7b2ed195658" exitCode=0 Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.150696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"2b71c90d3947e985d3c60cc0dd27d2933e68e61be980f33dcd93b7b2ed195658"} Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.550291 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" path="/var/lib/kubelet/pods/d36584b2-9533-4c0e-807f-247e1dbfde71/volumes" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 
00:20:32.650791 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810204 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810244 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810267 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.811613 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle" (OuterVolumeSpecName: "bundle") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.831923 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546" (OuterVolumeSpecName: "kube-api-access-t2546") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "kube-api-access-t2546". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.837516 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util" (OuterVolumeSpecName: "util") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911491 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911788 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911798 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174269 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" 
event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae"} Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174532 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174322 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.178272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" event={"ID":"e5a74652-f05c-47a0-8caa-77f544c95128","Type":"ContainerStarted","Data":"8dda70cd868de7fdd312496823ff4def57393e55badbe07eb7298f321989ba0d"} Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.199075 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" podStartSLOduration=13.422078622 podStartE2EDuration="17.199055273s" podCreationTimestamp="2026-03-08 00:20:16 +0000 UTC" firstStartedPulling="2026-03-08 00:20:28.877415981 +0000 UTC m=+882.997048214" lastFinishedPulling="2026-03-08 00:20:32.654392632 +0000 UTC m=+886.774024865" observedRunningTime="2026-03-08 00:20:33.19589595 +0000 UTC m=+887.315528183" watchObservedRunningTime="2026-03-08 00:20:33.199055273 +0000 UTC m=+887.318687516" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973183 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973394 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973405 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973415 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-utilities" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973421 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-utilities" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973430 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973437 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973446 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="util" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973452 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="util" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973463 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="pull" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973470 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="pull" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973478 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-content" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973483 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-content" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973572 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973581 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.974301 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.976114 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-wwn4w" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980496 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980545 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980505 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980511 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.981248 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.981305 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 08 00:20:33 crc 
kubenswrapper[4713]: I0308 00:20:33.982099 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.982295 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.994461 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135323 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135509 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135591 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135647 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135700 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: 
\"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc 
kubenswrapper[4713]: I0308 00:20:34.135915 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135947 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.136019 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236759 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236967 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236998 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237021 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: 
I0308 00:20:34.237138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237225 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.240651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.240936 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.246454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.246873 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247281 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247367 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250580 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.253274 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.253776 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.254042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: 
\"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.299137 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:38 crc kubenswrapper[4713]: I0308 00:20:38.930066 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:38 crc kubenswrapper[4713]: W0308 00:20:38.942983 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8a16625_a3a9_4404_bf4a_073fc8f621b9.slice/crio-48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4 WatchSource:0}: Error finding container 48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4: Status 404 returned error can't find the container with id 48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4 Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.223855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" event={"ID":"37b64282-4957-4a04-b1be-6d3184bfdd25","Type":"ContainerStarted","Data":"50b196c3829ced0df677ebe67124d31538db78b4f84c804a90e8deea5e0d9c5b"} Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.224993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4"} Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.238393 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" podStartSLOduration=9.471073908 podStartE2EDuration="19.238371073s" podCreationTimestamp="2026-03-08 00:20:20 +0000 UTC" firstStartedPulling="2026-03-08 
00:20:28.982448707 +0000 UTC m=+883.102080940" lastFinishedPulling="2026-03-08 00:20:38.749745882 +0000 UTC m=+892.869378105" observedRunningTime="2026-03-08 00:20:39.237378427 +0000 UTC m=+893.357010680" watchObservedRunningTime="2026-03-08 00:20:39.238371073 +0000 UTC m=+893.358003306" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.262977 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" event={"ID":"3d1a0596-7485-4376-9630-688753a7abd7","Type":"ContainerStarted","Data":"75b05d212251dce244b0f532ea49ccbbbd16f64f697b14cfa9946e1746a8a25d"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.263772 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.264951 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" event={"ID":"860dc604-80d3-4d4b-8b1e-8a430b706882","Type":"ContainerStarted","Data":"740c8696ab496cee743dc6e46525ad94d869dbfa12fcc83ef80cc86a356fd848"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.266357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" event={"ID":"e2152c14-6da7-4f74-a30e-da9e4e7c1acc","Type":"ContainerStarted","Data":"338535d61378695b0d9a4695cbc8f430a2abbfb499530b1ae6060fd07be1faa0"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.279607 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podStartSLOduration=2.065402395 podStartE2EDuration="33.279592184s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:12.092124148 +0000 UTC m=+866.211756381" lastFinishedPulling="2026-03-08 00:20:43.306313937 +0000 UTC m=+897.425946170" 
observedRunningTime="2026-03-08 00:20:44.2779225 +0000 UTC m=+898.397554753" watchObservedRunningTime="2026-03-08 00:20:44.279592184 +0000 UTC m=+898.399224417" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.296976 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podStartSLOduration=1.9139403910000001 podStartE2EDuration="33.29695833s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.923677158 +0000 UTC m=+866.043309391" lastFinishedPulling="2026-03-08 00:20:43.306695097 +0000 UTC m=+897.426327330" observedRunningTime="2026-03-08 00:20:44.296494228 +0000 UTC m=+898.416126481" watchObservedRunningTime="2026-03-08 00:20:44.29695833 +0000 UTC m=+898.416590563" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.314809 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podStartSLOduration=1.789947508 podStartE2EDuration="33.314790528s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.781523048 +0000 UTC m=+865.901155291" lastFinishedPulling="2026-03-08 00:20:43.306366078 +0000 UTC m=+897.425998311" observedRunningTime="2026-03-08 00:20:44.313045932 +0000 UTC m=+898.432678195" watchObservedRunningTime="2026-03-08 00:20:44.314790528 +0000 UTC m=+898.434422771" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.254234 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.255681 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.257748 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.257879 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wthx6" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.262051 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.278772 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.371463 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.371632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.473549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.473629 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.474182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.494545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.576788 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:51 crc kubenswrapper[4713]: I0308 00:20:51.804548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:52 crc kubenswrapper[4713]: I0308 00:20:52.239204 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:52 crc kubenswrapper[4713]: W0308 00:20:52.252787 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3714bd_7a55_41dd_8a2f_1013ca3fff6a.slice/crio-5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264 WatchSource:0}: Error finding container 5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264: Status 404 returned error can't find the container with id 5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264 Mar 08 00:20:52 crc kubenswrapper[4713]: I0308 00:20:52.314593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" event={"ID":"be3714bd-7a55-41dd-8a2f-1013ca3fff6a","Type":"ContainerStarted","Data":"5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264"} Mar 08 00:20:57 crc kubenswrapper[4713]: I0308 00:20:57.342922 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb"} Mar 08 00:20:58 crc kubenswrapper[4713]: I0308 00:20:58.755146 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:58 crc kubenswrapper[4713]: I0308 00:20:58.792948 4713 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:59 crc kubenswrapper[4713]: I0308 00:20:59.563620 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerID="901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb" exitCode=0 Mar 08 00:20:59 crc kubenswrapper[4713]: I0308 00:20:59.563673 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerDied","Data":"901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb"} Mar 08 00:21:00 crc kubenswrapper[4713]: I0308 00:21:00.572072 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerID="3a7dfc251b473197605f0fe9b0475cb4008cde89b99676c28a3f703bf1f74390" exitCode=0 Mar 08 00:21:00 crc kubenswrapper[4713]: I0308 00:21:00.572175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerDied","Data":"3a7dfc251b473197605f0fe9b0475cb4008cde89b99676c28a3f703bf1f74390"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.592627 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"074d80db31ea7158b61840a702a227c2c3efb176013854e00e182d894fe5c6c2"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.593255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.595375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" 
event={"ID":"be3714bd-7a55-41dd-8a2f-1013ca3fff6a","Type":"ContainerStarted","Data":"d8664930ddd47f528e3611ce57e6e1187ebfc3923411e19dbebd455e67cae667"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.627160 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=15.44471751 podStartE2EDuration="28.62714155s" podCreationTimestamp="2026-03-08 00:20:33 +0000 UTC" firstStartedPulling="2026-03-08 00:20:38.948774935 +0000 UTC m=+893.068407168" lastFinishedPulling="2026-03-08 00:20:52.131198975 +0000 UTC m=+906.250831208" observedRunningTime="2026-03-08 00:21:01.624771328 +0000 UTC m=+915.744403581" watchObservedRunningTime="2026-03-08 00:21:01.62714155 +0000 UTC m=+915.746773803" Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.655560 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" podStartSLOduration=3.828732017 podStartE2EDuration="12.655529725s" podCreationTimestamp="2026-03-08 00:20:49 +0000 UTC" firstStartedPulling="2026-03-08 00:20:52.254387377 +0000 UTC m=+906.374019610" lastFinishedPulling="2026-03-08 00:21:01.081185085 +0000 UTC m=+915.200817318" observedRunningTime="2026-03-08 00:21:01.64885778 +0000 UTC m=+915.768490013" watchObservedRunningTime="2026-03-08 00:21:01.655529725 +0000 UTC m=+915.775161968" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.451429 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.453658 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.455634 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.455897 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.456995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.457074 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.478249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546650 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547071 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547214 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547974 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547999 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649089 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649161 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649188 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649250 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649273 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649294 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649365 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649388 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649730 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650163 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650381 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.651673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652144 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652692 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.653067 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.659022 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.659408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.675479 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.778012 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.160234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.165459 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.480626 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"]
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.481398 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.485673 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.486395 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.486589 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sb25l"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.498618 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"]
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.563387 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.563458 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.634128 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerStarted","Data":"2231a649a8f84f0f717170316c535043ecf640c8208085f92f8a7e585f35d9d1"}
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.664684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.664735 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.683349 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.687355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.799116 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp"
Mar 08 00:21:08 crc kubenswrapper[4713]: I0308 00:21:08.254101 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"]
Mar 08 00:21:08 crc kubenswrapper[4713]: I0308 00:21:08.642632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" event={"ID":"1a191145-c818-4e84-8bf3-91145fe9db03","Type":"ContainerStarted","Data":"64a3c67f8f367e00538fbc4bcc945467433c77daaafe440ce0485464a1bbbf12"}
Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.903729 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"]
Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.904404 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.907438 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pdqsn"
Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.916810 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"]
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.016727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.016790 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.117694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.117840 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.143810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.143880 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.230139 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:14 crc kubenswrapper[4713]: I0308 00:21:14.399112 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerName="elasticsearch" probeResult="failure" output=<
Mar 08 00:21:14 crc kubenswrapper[4713]: {"timestamp": "2026-03-08T00:21:14+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 08 00:21:14 crc kubenswrapper[4713]: >
Mar 08 00:21:16 crc kubenswrapper[4713]: I0308 00:21:16.654456 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.281867 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"]
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.702299 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" event={"ID":"2a071bf2-22e7-40f7-976a-74f79abbbd78","Type":"ContainerStarted","Data":"544dd268fbdb1341d1113d889e9e47f46a4986daed46e78cb9d79bc6f7fd4949"}
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.702375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" event={"ID":"2a071bf2-22e7-40f7-976a-74f79abbbd78","Type":"ContainerStarted","Data":"2aa4bbe1ec078ce07bf5c1541fe1d0938e383834a58549e3503b572fe162e4bb"}
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.704161 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl"
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.705965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" event={"ID":"1a191145-c818-4e84-8bf3-91145fe9db03","Type":"ContainerStarted","Data":"e09e670039fc04d892e0967e16bcd71371700f4ce93a08c3cbdf3ba7af290f46"}
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.710497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerStarted","Data":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"}
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.710652 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" containerID="cri-o://811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" gracePeriod=30
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.725930 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" podStartSLOduration=7.725903971 podStartE2EDuration="7.725903971s" podCreationTimestamp="2026-03-08 00:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:17.724932316 +0000 UTC m=+931.844564549" watchObservedRunningTime="2026-03-08 00:21:17.725903971 +0000 UTC m=+931.845536214"
Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.748860 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" podStartSLOduration=2.169245771 podStartE2EDuration="10.748810202s" podCreationTimestamp="2026-03-08 00:21:07 +0000 UTC" firstStartedPulling="2026-03-08 00:21:08.268527358 +0000 UTC m=+922.388159591" lastFinishedPulling="2026-03-08 00:21:16.848091789 +0000 UTC m=+930.967724022" observedRunningTime="2026-03-08 00:21:17.745272879 +0000 UTC m=+931.864905132" watchObservedRunningTime="2026-03-08 00:21:17.748810202 +0000 UTC m=+931.868442475"
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.108925 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_829dcde5-b1d3-4479-875b-6275ec772c1d/manage-dockerfile/0.log"
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.109317 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132685 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132732 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132799 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133047 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133072 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133101 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133167 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") "
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133422 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133554 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133567 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133577 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133809 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.134057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.134152 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.138663 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4" (OuterVolumeSpecName: "kube-api-access-4jtf4") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "kube-api-access-4jtf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.140069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.140993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234590 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234630 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234651 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234664 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234675 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234685 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") on node \"crc\" DevicePath \"\""
Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234695 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") on
node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234705 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234714 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296336 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: E0308 00:21:18.296700 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296719 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296865 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.298007 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303345 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303350 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.326254 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372361 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372384 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372596 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372635 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372707 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372738 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372898 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.373000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod 
\"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474357 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474434 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474474 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474530 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474578 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474604 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474629 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.475624 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476363 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476690 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477108 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477205 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: 
\"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.478712 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.483332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.483664 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.502563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclxd\" (UniqueName: 
\"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.613082 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717096 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_829dcde5-b1d3-4479-875b-6275ec772c1d/manage-dockerfile/0.log" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717425 4713 generic.go:334] "Generic (PLEG): container finished" podID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" exitCode=1 Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerDied","Data":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"} Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerDied","Data":"2231a649a8f84f0f717170316c535043ecf640c8208085f92f8a7e585f35d9d1"} Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717587 4713 scope.go:117] "RemoveContainer" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717710 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.754549 4713 scope.go:117] "RemoveContainer" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.754648 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: E0308 00:21:18.755701 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": container with ID starting with 811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e not found: ID does not exist" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.755741 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"} err="failed to get container status \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": rpc error: code = NotFound desc = could not find container \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": container with ID starting with 811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e not found: ID does not exist" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.756912 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.873441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:19 crc kubenswrapper[4713]: I0308 00:21:19.725733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb"} Mar 08 00:21:19 crc kubenswrapper[4713]: I0308 00:21:19.726058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e"} Mar 08 00:21:20 crc kubenswrapper[4713]: I0308 00:21:20.298519 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:21:20 crc kubenswrapper[4713]: I0308 00:21:20.549015 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" path="/var/lib/kubelet/pods/829dcde5-b1d3-4479-875b-6275ec772c1d/volumes" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.441597 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.442810 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.447868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vqbzd" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.460459 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.548990 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.549078 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.650779 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.650953 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: 
\"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.671233 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.671694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.758595 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:25 crc kubenswrapper[4713]: W0308 00:21:25.025108 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f51ae9_d2ab_4704_aeeb_5710aceda4f0.slice/crio-f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4 WatchSource:0}: Error finding container f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4: Status 404 returned error can't find the container with id f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4 Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.037512 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.761936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gkqzr" 
event={"ID":"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0","Type":"ContainerStarted","Data":"c33fc1e38223c47754a1f717f6ccfef29fa174a7c0c22ce7c73011fe0b21b27a"} Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.761983 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gkqzr" event={"ID":"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0","Type":"ContainerStarted","Data":"f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4"} Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.782055 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-gkqzr" podStartSLOduration=1.7820366779999999 podStartE2EDuration="1.782036678s" podCreationTimestamp="2026-03-08 00:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:25.778797673 +0000 UTC m=+939.898429916" watchObservedRunningTime="2026-03-08 00:21:25.782036678 +0000 UTC m=+939.901668901" Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.234305 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.776586 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerID="31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb" exitCode=0 Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.776672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb"} Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.784209 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" 
containerID="3c984fc98d0cb19e91c6d652b313bc38124a2b4f356ef60376d103344ae061d8" exitCode=0 Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.784278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"3c984fc98d0cb19e91c6d652b313bc38124a2b4f356ef60376d103344ae061d8"} Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.834591 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_1fc148b4-f954-4ef0-8c15-bbff85220029/manage-dockerfile/0.log" Mar 08 00:21:28 crc kubenswrapper[4713]: I0308 00:21:28.795377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f"} Mar 08 00:21:28 crc kubenswrapper[4713]: I0308 00:21:28.822466 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=10.822436721999999 podStartE2EDuration="10.822436722s" podCreationTimestamp="2026-03-08 00:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:28.82008209 +0000 UTC m=+942.939714343" watchObservedRunningTime="2026-03-08 00:21:28.822436722 +0000 UTC m=+942.942068955" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.133426 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.135263 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.139823 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.140320 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.140437 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.146434 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.232981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.334536 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.361786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " 
pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.456787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.668425 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:01 crc kubenswrapper[4713]: I0308 00:22:01.045171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerStarted","Data":"354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10"} Mar 08 00:22:02 crc kubenswrapper[4713]: I0308 00:22:02.052180 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerStarted","Data":"03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"} Mar 08 00:22:02 crc kubenswrapper[4713]: I0308 00:22:02.065512 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" podStartSLOduration=1.065513952 podStartE2EDuration="2.065491307s" podCreationTimestamp="2026-03-08 00:22:00 +0000 UTC" firstStartedPulling="2026-03-08 00:22:00.671811121 +0000 UTC m=+974.791443354" lastFinishedPulling="2026-03-08 00:22:01.671788446 +0000 UTC m=+975.791420709" observedRunningTime="2026-03-08 00:22:02.064753998 +0000 UTC m=+976.184386231" watchObservedRunningTime="2026-03-08 00:22:02.065491307 +0000 UTC m=+976.185123540" Mar 08 00:22:03 crc kubenswrapper[4713]: I0308 00:22:03.062270 4713 generic.go:334] "Generic (PLEG): container finished" podID="985fdd12-7009-419a-8098-df4c84849d22" containerID="03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125" exitCode=0 Mar 08 00:22:03 crc 
kubenswrapper[4713]: I0308 00:22:03.062318 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerDied","Data":"03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"} Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.322328 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.400506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"985fdd12-7009-419a-8098-df4c84849d22\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.405793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz" (OuterVolumeSpecName: "kube-api-access-qwdlz") pod "985fdd12-7009-419a-8098-df4c84849d22" (UID: "985fdd12-7009-419a-8098-df4c84849d22"). InnerVolumeSpecName "kube-api-access-qwdlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.502430 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerDied","Data":"354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10"} Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083071 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083143 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.125525 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.133844 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:22:06 crc kubenswrapper[4713]: I0308 00:22:06.550971 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4623866-795f-438d-9b3b-66afb30f9657" path="/var/lib/kubelet/pods/e4623866-795f-438d-9b3b-66afb30f9657/volumes" Mar 08 00:22:14 crc kubenswrapper[4713]: I0308 00:22:14.877416 4713 scope.go:117] "RemoveContainer" containerID="88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.126662 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:28 crc kubenswrapper[4713]: E0308 00:22:28.127516 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.127530 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.127648 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.128473 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.142368 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.250698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.250944 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.251238 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352246 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352308 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352369 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352927 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.383027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.488280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.790242 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:29 crc kubenswrapper[4713]: I0308 00:22:29.252268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerStarted","Data":"4f79f8d113b33a8ed31c712e6bc24fadf44173317f45c47a3aecbb3f986bd86c"} Mar 08 00:22:30 crc kubenswrapper[4713]: I0308 00:22:30.265384 4713 generic.go:334] "Generic (PLEG): container finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" exitCode=0 Mar 08 00:22:30 crc kubenswrapper[4713]: I0308 00:22:30.265550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957"} Mar 08 00:22:32 crc kubenswrapper[4713]: I0308 00:22:32.284294 4713 generic.go:334] "Generic (PLEG): container 
finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" exitCode=0 Mar 08 00:22:32 crc kubenswrapper[4713]: I0308 00:22:32.284378 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce"} Mar 08 00:22:33 crc kubenswrapper[4713]: I0308 00:22:33.293883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerStarted","Data":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} Mar 08 00:22:33 crc kubenswrapper[4713]: I0308 00:22:33.317256 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zwsx" podStartSLOduration=2.92886965 podStartE2EDuration="5.317235581s" podCreationTimestamp="2026-03-08 00:22:28 +0000 UTC" firstStartedPulling="2026-03-08 00:22:30.269670132 +0000 UTC m=+1004.389302365" lastFinishedPulling="2026-03-08 00:22:32.658036063 +0000 UTC m=+1006.777668296" observedRunningTime="2026-03-08 00:22:33.316794559 +0000 UTC m=+1007.436426802" watchObservedRunningTime="2026-03-08 00:22:33.317235581 +0000 UTC m=+1007.436867814" Mar 08 00:22:34 crc kubenswrapper[4713]: I0308 00:22:34.501590 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:22:34 crc kubenswrapper[4713]: I0308 00:22:34.501656 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.488603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.489236 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.553032 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:39 crc kubenswrapper[4713]: I0308 00:22:39.381301 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:39 crc kubenswrapper[4713]: I0308 00:22:39.418247 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:41 crc kubenswrapper[4713]: I0308 00:22:41.337894 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zwsx" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" containerID="cri-o://92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" gracePeriod=2 Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.153422 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199201 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199264 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199297 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.200410 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities" (OuterVolumeSpecName: "utilities") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.205082 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn" (OuterVolumeSpecName: "kube-api-access-tt5wn") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "kube-api-access-tt5wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.254805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300612 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300656 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300669 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364909 4713 generic.go:334] "Generic (PLEG): container finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" exitCode=0 Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364948 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364962 4713 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.365059 4713 scope.go:117] "RemoveContainer" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.366554 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"4f79f8d113b33a8ed31c712e6bc24fadf44173317f45c47a3aecbb3f986bd86c"} Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.379924 4713 scope.go:117] "RemoveContainer" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.393919 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.395899 4713 scope.go:117] "RemoveContainer" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.403156 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.415382 4713 scope.go:117] "RemoveContainer" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 00:22:45.416031 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": container with ID starting with 92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108 not found: ID does not exist" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416083 
4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} err="failed to get container status \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": rpc error: code = NotFound desc = could not find container \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": container with ID starting with 92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108 not found: ID does not exist" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416114 4713 scope.go:117] "RemoveContainer" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 00:22:45.416406 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": container with ID starting with 86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce not found: ID does not exist" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416440 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce"} err="failed to get container status \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": rpc error: code = NotFound desc = could not find container \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": container with ID starting with 86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce not found: ID does not exist" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416458 4713 scope.go:117] "RemoveContainer" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 
00:22:45.416695 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": container with ID starting with 7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957 not found: ID does not exist" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416720 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957"} err="failed to get container status \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": rpc error: code = NotFound desc = could not find container \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": container with ID starting with 7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957 not found: ID does not exist" Mar 08 00:22:46 crc kubenswrapper[4713]: I0308 00:22:46.549669 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" path="/var/lib/kubelet/pods/e6a28e60-a4ea-42bc-baaf-d90f095194db/volumes" Mar 08 00:23:04 crc kubenswrapper[4713]: I0308 00:23:04.500927 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:23:04 crc kubenswrapper[4713]: I0308 00:23:04.501418 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 00:23:09 crc kubenswrapper[4713]: I0308 00:23:09.512755 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerID="ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f" exitCode=0 Mar 08 00:23:09 crc kubenswrapper[4713]: I0308 00:23:09.512850 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f"} Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.761187 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838613 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838661 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838693 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838710 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838734 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838785 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838817 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod 
\"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838865 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838886 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838910 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839682 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839791 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839965 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839950 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.840378 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.840853 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.845995 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.846024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.846049 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd" (OuterVolumeSpecName: "kube-api-access-dclxd") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "kube-api-access-dclxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.875260 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.940915 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941199 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941329 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941389 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941447 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941502 4713 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941555 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941612 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941666 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941745 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.019233 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.042700 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528041 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e"} Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528082 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528144 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:23:12 crc kubenswrapper[4713]: I0308 00:23:12.595815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:12 crc kubenswrapper[4713]: I0308 00:23:12.663805 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088633 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-utilities" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088645 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-utilities" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088653 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088659 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088672 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="manage-dockerfile" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088678 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="manage-dockerfile" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088694 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-content" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088699 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-content" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088707 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="git-clone" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="git-clone" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088720 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088727 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088855 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088868 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.089489 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.091869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092094 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092382 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092468 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.106477 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197406 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197607 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197775 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198038 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198107 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc 
kubenswrapper[4713]: I0308 00:23:15.198200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198305 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198380 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198411 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300220 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300427 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300467 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 
00:23:15.300781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301064 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301238 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 
crc kubenswrapper[4713]: I0308 00:23:15.301688 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.302293 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.309015 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.309045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.321307 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.405359 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:16 crc kubenswrapper[4713]: I0308 00:23:16.555166 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:16 crc kubenswrapper[4713]: I0308 00:23:16.573915 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerStarted","Data":"786d94fda50c47e27ddd447590d235cdd4682da4924ec66b3fd625d8e492e3f4"} Mar 08 00:23:17 crc kubenswrapper[4713]: I0308 00:23:17.583367 4713 generic.go:334] "Generic (PLEG): container finished" podID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" exitCode=0 Mar 08 00:23:17 crc kubenswrapper[4713]: I0308 00:23:17.583491 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001"} Mar 08 00:23:18 crc kubenswrapper[4713]: I0308 00:23:18.591897 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerStarted","Data":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} Mar 08 00:23:18 crc kubenswrapper[4713]: I0308 00:23:18.616009 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.615991084 podStartE2EDuration="3.615991084s" podCreationTimestamp="2026-03-08 00:23:15 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:23:18.610812406 +0000 UTC m=+1052.730444659" watchObservedRunningTime="2026-03-08 00:23:18.615991084 +0000 UTC m=+1052.735623317" Mar 08 00:23:25 crc kubenswrapper[4713]: I0308 00:23:25.514391 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:25 crc kubenswrapper[4713]: I0308 00:23:25.515496 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build" containerID="cri-o://22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" gracePeriod=30 Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.183469 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.185674 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.187452 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.188344 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.189952 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.213752 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274749 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod 
\"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275147 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275220 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275277 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275356 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: 
\"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376583 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376603 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376718 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377613 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377847 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377975 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378101 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378188 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378723 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.390746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.391192 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.398492 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.502189 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.903915 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: W0308 00:23:27.914255 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod036ba45b_c97e_4ac4_a537_373dfa81f0de.slice/crio-99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7 WatchSource:0}: Error finding container 99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7: Status 404 returned error can't find the container with id 99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7 Mar 08 00:23:28 crc kubenswrapper[4713]: I0308 00:23:28.659710 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.618669 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_28cea654-fd65-41d0-a3bf-74641ad0990c/docker-build/0.log" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.619268 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.667443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.668752 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_28cea654-fd65-41d0-a3bf-74641ad0990c/docker-build/0.log" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669134 4713 generic.go:334] "Generic (PLEG): container finished" podID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" exitCode=1 Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669163 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"786d94fda50c47e27ddd447590d235cdd4682da4924ec66b3fd625d8e492e3f4"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669208 4713 scope.go:117] "RemoveContainer" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669166 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707603 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707782 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707969 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708038 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708071 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708091 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708097 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708731 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708994 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708157 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709184 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709073 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710006 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710033 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710045 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710054 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710066 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710076 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.711369 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod 
"28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.713401 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.713461 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.714096 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm" (OuterVolumeSpecName: "kube-api-access-knkjm") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "kube-api-access-knkjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.751535 4713 scope.go:117] "RemoveContainer" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783081 4713 scope.go:117] "RemoveContainer" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: E0308 00:23:29.783618 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": container with ID starting with 22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341 not found: ID does not exist" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783654 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} err="failed to get container status \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": rpc error: code = NotFound desc = could not find container \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": container with ID starting with 22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341 not found: ID does not exist" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783675 4713 scope.go:117] "RemoveContainer" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: E0308 00:23:29.784843 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": container with ID starting with 
94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001 not found: ID does not exist" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.784920 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001"} err="failed to get container status \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": rpc error: code = NotFound desc = could not find container \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": container with ID starting with 94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001 not found: ID does not exist" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811425 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811461 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811472 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811485 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.957542 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.994125 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.014623 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.014666 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.304802 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.310829 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.549587 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" path="/var/lib/kubelet/pods/28cea654-fd65-41d0-a3bf-74641ad0990c/volumes"
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.678256 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a" exitCode=0
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.678317 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a"}
Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.687408 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="df57be6898528c792d3da245f48f36ebd1e922776e89da3fc00040bd8ac76e19" exitCode=0
Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.687453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"df57be6898528c792d3da245f48f36ebd1e922776e89da3fc00040bd8ac76e19"}
Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.730657 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_036ba45b-c97e-4ac4-a537-373dfa81f0de/manage-dockerfile/0.log"
Mar 08 00:23:32 crc kubenswrapper[4713]: I0308 00:23:32.698064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab"}
Mar 08 00:23:32 crc kubenswrapper[4713]: I0308 00:23:32.741945 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.741921315 podStartE2EDuration="5.741921315s" podCreationTimestamp="2026-03-08 00:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:23:32.735252257 +0000 UTC m=+1066.854884500" watchObservedRunningTime="2026-03-08 00:23:32.741921315 +0000 UTC m=+1066.861553548"
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501171 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501775 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501879 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.502944 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.503027 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0" gracePeriod=600
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713918 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0" exitCode=0
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713956 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"}
Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713986 4713 scope.go:117] "RemoveContainer" containerID="3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"
Mar 08 00:23:35 crc kubenswrapper[4713]: I0308 00:23:35.722124 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"}
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.132361 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"]
Mar 08 00:24:00 crc kubenswrapper[4713]: E0308 00:24:00.133098 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="manage-dockerfile"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133110 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="manage-dockerfile"
Mar 08 00:24:00 crc kubenswrapper[4713]: E0308 00:24:00.133120 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133126 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133235 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133606 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136612 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136683 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136798 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.140404 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"]
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.213654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.315041 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.333739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.461725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.861175 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"]
Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.883444 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerStarted","Data":"3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e"}
Mar 08 00:24:03 crc kubenswrapper[4713]: I0308 00:24:03.903492 4713 generic.go:334] "Generic (PLEG): container finished" podID="42829204-3911-4926-bcab-0e8f7b731986" containerID="5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4" exitCode=0
Mar 08 00:24:03 crc kubenswrapper[4713]: I0308 00:24:03.903568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerDied","Data":"5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4"}
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.140355 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.277388 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"42829204-3911-4926-bcab-0e8f7b731986\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") "
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.284065 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4" (OuterVolumeSpecName: "kube-api-access-94wj4") pod "42829204-3911-4926-bcab-0e8f7b731986" (UID: "42829204-3911-4926-bcab-0e8f7b731986"). InnerVolumeSpecName "kube-api-access-94wj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.378790 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerDied","Data":"3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e"}
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922477 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e"
Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922168 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn"
Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.213545 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"]
Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.218463 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"]
Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.548534 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" path="/var/lib/kubelet/pods/bbf256d4-02b4-46fd-86a1-793e34a17bf5/volumes"
Mar 08 00:24:14 crc kubenswrapper[4713]: I0308 00:24:14.964464 4713 scope.go:117] "RemoveContainer" containerID="0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f"
Mar 08 00:24:51 crc kubenswrapper[4713]: I0308 00:24:51.223848 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab" exitCode=0
Mar 08 00:24:51 crc kubenswrapper[4713]: I0308 00:24:51.223898 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab"}
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.474815 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617628 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617680 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617709 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617729 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617778 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617931 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617996 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") "
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618578 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.619478 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.621974 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.622360 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.622713 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.623187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.623808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.624606 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.626260 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds" (OuterVolumeSpecName: "kube-api-access-fftds") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "kube-api-access-fftds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719472 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719497 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719507 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719515 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719525 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719534 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719542 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719550 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719559 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.814964 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.820437 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7"}
Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237396 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7"
Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237419 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 08 00:24:54 crc kubenswrapper[4713]: I0308 00:24:54.375461 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:24:54 crc kubenswrapper[4713]: I0308 00:24:54.444951 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.526130 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"]
Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.527970 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528067 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528145 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="manage-dockerfile"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528239 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="manage-dockerfile"
Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528312 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="git-clone"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528398 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="git-clone"
Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528471 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528544 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528738 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528854 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.529673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.531747 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532085 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532329 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.549747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"]
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675218 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675238 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675260 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675288 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675306 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675328 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675351 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675386 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.777120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778233 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build"
Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778346 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName:
\"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778379 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778504 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778569 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" 
(UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778751 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778848 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778450 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 
00:24:56.778917 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778925 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778971 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.779084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.779681 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.784457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: 
\"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.794100 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.798197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.851013 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:24:57 crc kubenswrapper[4713]: I0308 00:24:57.066399 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:24:57 crc kubenswrapper[4713]: I0308 00:24:57.262257 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerStarted","Data":"af27dda306199e60ac82c42d73b874c62495ade449cd1c0b0121c417c647d7e1"} Mar 08 00:24:59 crc kubenswrapper[4713]: I0308 00:24:59.277970 4713 generic.go:334] "Generic (PLEG): container finished" podID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerID="820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d" exitCode=0 Mar 08 00:24:59 crc kubenswrapper[4713]: I0308 00:24:59.278067 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d"} Mar 08 00:25:00 crc kubenswrapper[4713]: I0308 00:25:00.287590 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerStarted","Data":"73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76"} Mar 08 00:25:00 crc kubenswrapper[4713]: I0308 00:25:00.313106 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.313088232 podStartE2EDuration="4.313088232s" podCreationTimestamp="2026-03-08 00:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:00.31188527 +0000 UTC m=+1154.431517503" watchObservedRunningTime="2026-03-08 00:25:00.313088232 +0000 UTC m=+1154.432720465" Mar 08 00:25:07 crc 
kubenswrapper[4713]: I0308 00:25:07.064856 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.066623 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build" containerID="cri-o://73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" gracePeriod=30 Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328461 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328870 4713 generic.go:334] "Generic (PLEG): container finished" podID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerID="73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" exitCode=1 Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328916 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76"} Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.956492 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.958280 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125452 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125514 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125563 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125627 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125687 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125722 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125747 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125775 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod 
\"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125848 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.126181 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.126387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.127008 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.127083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128064 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128073 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128485 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132025 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132291 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132864 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx" (OuterVolumeSpecName: "kube-api-access-pndlx") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "kube-api-access-pndlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.211083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227668 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227704 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227715 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227725 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227735 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227744 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227752 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227763 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227774 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227785 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227795 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.245024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.328524 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336362 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336890 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"af27dda306199e60ac82c42d73b874c62495ade449cd1c0b0121c417c647d7e1"} Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336929 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336938 4713 scope.go:117] "RemoveContainer" containerID="73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.372844 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.378003 4713 scope.go:117] "RemoveContainer" containerID="820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.381589 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.548160 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" path="/var/lib/kubelet/pods/75b6be2f-9bac-4c3b-94b5-7a063d891561/volumes" Mar 08 00:25:08 crc 
kubenswrapper[4713]: I0308 00:25:08.697866 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:08 crc kubenswrapper[4713]: E0308 00:25:08.698161 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698174 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: E0308 00:25:08.698187 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="manage-dockerfile"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698194 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="manage-dockerfile"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698312 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.699112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.702508 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.702511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.704371 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.704493 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.713068 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835616 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835635 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835706 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835721 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835842 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835891 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936946 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937028 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937135 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937272 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.938331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.938382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.939322 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.942423 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.942462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.954556 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.029673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.210653 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.344596 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4"}
Mar 08 00:25:10 crc kubenswrapper[4713]: I0308 00:25:10.351990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895"}
Mar 08 00:25:11 crc kubenswrapper[4713]: I0308 00:25:11.363967 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895" exitCode=0
Mar 08 00:25:11 crc kubenswrapper[4713]: I0308 00:25:11.364011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895"}
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.373956 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="10805c8581330d572333818f1f8b595a89a5246c39de1a0d940c7497db5c499f" exitCode=0
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.374046 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"10805c8581330d572333818f1f8b595a89a5246c39de1a0d940c7497db5c499f"}
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.407706 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_b950bb15-0796-4aa8-9920-6c0d3dd622e7/manage-dockerfile/0.log"
Mar 08 00:25:13 crc kubenswrapper[4713]: I0308 00:25:13.381996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7"}
Mar 08 00:25:13 crc kubenswrapper[4713]: I0308 00:25:13.413546 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.413519053 podStartE2EDuration="5.413519053s" podCreationTimestamp="2026-03-08 00:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:13.405595695 +0000 UTC m=+1167.525227948" watchObservedRunningTime="2026-03-08 00:25:13.413519053 +0000 UTC m=+1167.533151296"
Mar 08 00:25:34 crc kubenswrapper[4713]: I0308 00:25:34.501293 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:25:34 crc kubenswrapper[4713]: I0308 00:25:34.501911 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.137897 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.139607 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142235 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142754 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142916 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.146344 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.286211 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.387138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.406770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.458055 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.872431 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:01 crc kubenswrapper[4713]: I0308 00:26:01.667853 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerStarted","Data":"badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"}
Mar 08 00:26:03 crc kubenswrapper[4713]: I0308 00:26:03.683924 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerStarted","Data":"76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c"}
Mar 08 00:26:03 crc kubenswrapper[4713]: I0308 00:26:03.701123 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" podStartSLOduration=1.178882876 podStartE2EDuration="3.70110945s" podCreationTimestamp="2026-03-08 00:26:00 +0000 UTC" firstStartedPulling="2026-03-08 00:26:00.884555115 +0000 UTC m=+1215.004187348" lastFinishedPulling="2026-03-08 00:26:03.406781689 +0000 UTC m=+1217.526413922" observedRunningTime="2026-03-08 00:26:03.699452936 +0000 UTC m=+1217.819085189" watchObservedRunningTime="2026-03-08 00:26:03.70110945 +0000 UTC m=+1217.820741683"
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.501110 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.501169 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.691370 4713 generic.go:334] "Generic (PLEG): container finished" podID="45fc1987-0bdc-476c-9315-18ddbf570461" containerID="76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c" exitCode=0
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.691434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerDied","Data":"76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c"}
Mar 08 00:26:05 crc kubenswrapper[4713]: I0308 00:26:05.983219 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.068059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"45fc1987-0bdc-476c-9315-18ddbf570461\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") "
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.076031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq" (OuterVolumeSpecName: "kube-api-access-zf7cq") pod "45fc1987-0bdc-476c-9315-18ddbf570461" (UID: "45fc1987-0bdc-476c-9315-18ddbf570461"). InnerVolumeSpecName "kube-api-access-zf7cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.169972 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") on node \"crc\" DevicePath \"\""
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704783 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerDied","Data":"badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"}
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704845 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704854 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.750763 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.757760 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:26:08 crc kubenswrapper[4713]: I0308 00:26:08.549562 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" path="/var/lib/kubelet/pods/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085/volumes"
Mar 08 00:26:15 crc kubenswrapper[4713]: I0308 00:26:15.058972 4713 scope.go:117] "RemoveContainer" containerID="f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501063 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501633 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501683 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.502299 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.502367 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f" gracePeriod=600
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.886748 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f" exitCode=0
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.886852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"}
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.887591 4713 scope.go:117] "RemoveContainer" containerID="c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"
Mar 08 00:26:35 crc kubenswrapper[4713]: I0308 00:26:35.897010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"}
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.137704 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: E0308 00:28:00.138657 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.138679 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.138862 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.139365 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.143964 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.144065 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.144243 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.146395 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.312910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.414160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.448607 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.467453 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.886485 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: W0308 00:28:00.891793 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f9ab32_0c71_4b60_b499_75b2f4f4dcf3.slice/crio-39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21 WatchSource:0}: Error finding container 39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21: Status 404 returned error can't find the container with id 39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.894436 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 00:28:01 crc kubenswrapper[4713]: I0308 00:28:01.611164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerStarted","Data":"39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"}
Mar 08 00:28:06 crc kubenswrapper[4713]: I0308 00:28:06.642611 4713 generic.go:334] "Generic (PLEG): container finished" podID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerID="ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71" exitCode=0
Mar 08 00:28:06 crc kubenswrapper[4713]: I0308 00:28:06.642655 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerDied","Data":"ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71"}
Mar 08 00:28:07 crc kubenswrapper[4713]: I0308 00:28:07.844886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.013305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") "
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.019362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg" (OuterVolumeSpecName: "kube-api-access-rlxdg") pod "91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" (UID: "91f9ab32-0c71-4b60-b499-75b2f4f4dcf3"). InnerVolumeSpecName "kube-api-access-rlxdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.115282 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") on node \"crc\" DevicePath \"\""
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerDied","Data":"39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"}
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656094 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656132 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.897418 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"]
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.901758 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"]
Mar 08 00:28:10 crc kubenswrapper[4713]: I0308 00:28:10.555196 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985fdd12-7009-419a-8098-df4c84849d22" path="/var/lib/kubelet/pods/985fdd12-7009-419a-8098-df4c84849d22/volumes"
Mar 08 00:28:15 crc kubenswrapper[4713]: I0308 00:28:15.128955 4713 scope.go:117] "RemoveContainer" containerID="03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"
Mar 08 00:28:29 crc kubenswrapper[4713]: I0308 00:28:29.781350 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7" exitCode=0
Mar 08 00:28:29 crc kubenswrapper[4713]: I0308 00:28:29.781431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7"}
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.022648 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100664 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100797 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName:
\"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100893 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100968 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101000 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 
00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101060 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101100 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101377 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101489 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs" 
(OuterVolumeSpecName: "build-system-configs") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101556 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101975 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.102205 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.102541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). 
InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.105963 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.106270 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.106590 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72" (OuterVolumeSpecName: "kube-api-access-wwq72") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "kube-api-access-wwq72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.113198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201946 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201984 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201996 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202007 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202018 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202029 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202040 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202052 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202062 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.446680 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.505329 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796588 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4"} Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796630 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796683 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 08 00:28:33 crc kubenswrapper[4713]: I0308 00:28:33.579927 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:33 crc kubenswrapper[4713]: I0308 00:28:33.631660 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.527418 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529125 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="manage-dockerfile" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529239 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="manage-dockerfile" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529326 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529403 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529482 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="git-clone" Mar 08 00:28:35 crc 
kubenswrapper[4713]: I0308 00:28:35.529552 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="git-clone" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529637 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529714 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529954 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.530088 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.531326 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.533702 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534123 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534185 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.544454 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563085 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563143 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563218 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") 
pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563310 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563330 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563413 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563472 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563504 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665145 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665186 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665210 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665261 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665266 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665359 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665374 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod 
\"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666220 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666271 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" 
Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666326 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666711 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.671599 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.671700 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.691445 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.845542 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.240743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.826696 4713 generic.go:334] "Generic (PLEG): container finished" podID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerID="01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451" exitCode=0 Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.826766 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451"} Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.827034 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerStarted","Data":"6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432"} Mar 08 00:28:37 crc kubenswrapper[4713]: I0308 00:28:37.835438 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerStarted","Data":"0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef"} Mar 08 00:28:37 crc kubenswrapper[4713]: I0308 00:28:37.859342 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.8593250770000003 podStartE2EDuration="2.859325077s" podCreationTimestamp="2026-03-08 00:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:28:37.855754373 +0000 UTC m=+1371.975386606" watchObservedRunningTime="2026-03-08 00:28:37.859325077 +0000 UTC m=+1371.978957310" Mar 08 
00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.871747 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.872693 4713 generic.go:334] "Generic (PLEG): container finished" podID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerID="0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef" exitCode=1 Mar 08 00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.872740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef"} Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.114805 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.115512 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297271 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297315 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297342 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297407 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297425 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297436 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297475 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297515 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297541 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297590 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297815 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298267 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.299188 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.299354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303660 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp" (OuterVolumeSpecName: "kube-api-access-szthp") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "kube-api-access-szthp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303796 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303847 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.360498 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399192 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399223 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399232 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399239 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399250 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399259 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399270 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399277 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399285 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399293 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.655856 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.703047 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891175 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432"} Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891641 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891702 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.938679 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.946379 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:46 crc kubenswrapper[4713]: I0308 00:28:46.552662 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" path="/var/lib/kubelet/pods/e709cdbe-6c8e-4853-85f3-453fc41a930d/volumes" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.544980 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:47 crc kubenswrapper[4713]: E0308 00:28:47.545232 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="manage-dockerfile" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545247 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="manage-dockerfile" Mar 08 00:28:47 crc kubenswrapper[4713]: E0308 00:28:47.545269 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545278 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545420 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.546213 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.547833 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.547882 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.548688 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.549465 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.567103 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625843 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625897 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625977 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626046 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626185 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626254 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626329 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626378 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626426 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727333 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727418 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727439 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727459 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727502 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727530 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727563 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727590 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: 
\"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729013 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729057 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729059 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 
crc kubenswrapper[4713]: I0308 00:28:47.732305 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.737621 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.744758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.860441 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.095131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.913976 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992"} Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.914232 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14"} Mar 08 00:28:49 crc kubenswrapper[4713]: I0308 00:28:49.928201 4713 generic.go:334] "Generic (PLEG): container finished" podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992" exitCode=0 Mar 08 00:28:49 crc kubenswrapper[4713]: I0308 00:28:49.928271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992"} Mar 08 00:28:50 crc kubenswrapper[4713]: I0308 00:28:50.937664 4713 generic.go:334] "Generic (PLEG): container finished" podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="4c71b1a2140085bec3748281dfbee0833851e2d135c6b9e429fa875adb54d2c5" exitCode=0 Mar 08 00:28:50 crc kubenswrapper[4713]: I0308 00:28:50.937713 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"4c71b1a2140085bec3748281dfbee0833851e2d135c6b9e429fa875adb54d2c5"} Mar 08 00:28:51 
crc kubenswrapper[4713]: I0308 00:28:51.001885 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_1c4738b4-e463-4bb9-a2dc-0a7861232c1d/manage-dockerfile/0.log" Mar 08 00:28:51 crc kubenswrapper[4713]: I0308 00:28:51.946966 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2"} Mar 08 00:28:51 crc kubenswrapper[4713]: I0308 00:28:51.976787 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.976761364 podStartE2EDuration="4.976761364s" podCreationTimestamp="2026-03-08 00:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:28:51.971559427 +0000 UTC m=+1386.091191700" watchObservedRunningTime="2026-03-08 00:28:51.976761364 +0000 UTC m=+1386.096393637" Mar 08 00:29:04 crc kubenswrapper[4713]: I0308 00:29:04.500387 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:29:04 crc kubenswrapper[4713]: I0308 00:29:04.500925 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:29:32 crc kubenswrapper[4713]: I0308 00:29:32.220970 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2" exitCode=0 Mar 08 00:29:32 crc kubenswrapper[4713]: I0308 00:29:32.221009 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2"} Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.569379 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725952 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod 
\"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726057 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726126 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726170 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726191 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726230 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726253 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726325 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726643 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726659 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727076 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.729349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.733078 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55" (OuterVolumeSpecName: "kube-api-access-85p55") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "kube-api-access-85p55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.733514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.734083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827514 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827550 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827560 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827569 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827581 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827595 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827603 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") 
on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827612 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.862332 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.929268 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.239887 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.239879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14"} Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.240037 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.442541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.501462 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.501526 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.538200 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.299492 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300072 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="git-clone" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300088 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="git-clone" Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300105 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="manage-dockerfile" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300114 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="manage-dockerfile" Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300127 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300136 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300274 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.301018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305251 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305322 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305489 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305779 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.314908 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: 
\"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476681 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 
crc kubenswrapper[4713]: I0308 00:29:37.476779 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476875 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476907 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.477035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.477057 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578179 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578244 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578279 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578502 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578552 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578651 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578682 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578738 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 
00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578962 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579160 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579429 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 
crc kubenswrapper[4713]: I0308 00:29:37.579606 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579720 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579813 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.586420 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.587368 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.599035 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.614258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:38 crc kubenswrapper[4713]: I0308 00:29:38.009109 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:38 crc kubenswrapper[4713]: I0308 00:29:38.272422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerStarted","Data":"7d3540bca41e1d46698b63f4f058eb4474315d57b0b021937080565a780badb3"} Mar 08 00:29:39 crc kubenswrapper[4713]: I0308 00:29:39.278885 4713 generic.go:334] "Generic (PLEG): container finished" podID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerID="75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78" exitCode=0 Mar 08 00:29:39 crc kubenswrapper[4713]: I0308 00:29:39.278938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78"} Mar 08 00:29:40 crc kubenswrapper[4713]: I0308 00:29:40.288450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerStarted","Data":"2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc"} Mar 08 00:29:40 crc kubenswrapper[4713]: I0308 00:29:40.316293 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.316258436 podStartE2EDuration="3.316258436s" podCreationTimestamp="2026-03-08 00:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:29:40.310408282 +0000 UTC m=+1434.430040515" watchObservedRunningTime="2026-03-08 00:29:40.316258436 +0000 UTC m=+1434.435890689" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.024751 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.025577 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" containerID="cri-o://2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" gracePeriod=30 Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342042 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342764 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerID="2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" exitCode=1 Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342803 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc"} Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.380000 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.380591 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521314 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521419 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521465 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " 
Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522239 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522438 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522615 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 
00:29:48.522765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523228 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523247 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523260 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523780 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.524538 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527487 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6" (OuterVolumeSpecName: "kube-api-access-vwpf6") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "kube-api-access-vwpf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527863 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.591359 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624320 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624353 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624363 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624373 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624383 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624392 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624403 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624414 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.883678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.928545 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.351231 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352309 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"7d3540bca41e1d46698b63f4f058eb4474315d57b0b021937080565a780badb3"} Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352353 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352386 4713 scope.go:117] "RemoveContainer" containerID="2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.386548 4713 scope.go:117] "RemoveContainer" containerID="75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.404930 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.409448 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674297 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: E0308 00:29:49.674624 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674645 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: E0308 00:29:49.674670 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="manage-dockerfile" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674683 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="manage-dockerfile" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674930 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 
00:29:49.676788 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.679551 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.680682 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.680720 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.681036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.690238 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839128 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 
00:29:49.839348 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839429 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839495 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839555 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839895 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840201 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840302 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941347 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941410 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941528 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941565 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941618 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941643 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941712 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942035 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942269 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942727 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.943023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.948529 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.948673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.958644 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.028550 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.462051 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.551846 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" path="/var/lib/kubelet/pods/88dd7370-e036-44f4-906c-a03f3798ee7f/volumes"
Mar 08 00:29:51 crc kubenswrapper[4713]: I0308 00:29:51.372919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8"}
Mar 08 00:29:51 crc kubenswrapper[4713]: I0308 00:29:51.373264 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90"}
Mar 08 00:29:52 crc kubenswrapper[4713]: I0308 00:29:52.382391 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8" exitCode=0
Mar 08 00:29:52 crc kubenswrapper[4713]: I0308 00:29:52.382432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8"}
Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.389906 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="769ad33b4491cfff20c31acfa1bd44ca25f93ee045dbe8cee23176ab67a67457" exitCode=0
Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.390019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"769ad33b4491cfff20c31acfa1bd44ca25f93ee045dbe8cee23176ab67a67457"}
Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.431580 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_88b0640d-1c8b-4309-bce2-011f21f4578c/manage-dockerfile/0.log"
Mar 08 00:29:54 crc kubenswrapper[4713]: I0308 00:29:54.402244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d"}
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.128522 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=11.128502931 podStartE2EDuration="11.128502931s" podCreationTimestamp="2026-03-08 00:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:29:54.438041078 +0000 UTC m=+1448.557673321" watchObservedRunningTime="2026-03-08 00:30:00.128502931 +0000 UTC m=+1454.248135164"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.134111 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"]
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.135014 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.138327 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.138905 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.141722 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"]
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.143070 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.236403 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"]
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.237496 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.240379 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.244150 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.248380 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"]
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.272409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373784 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373959 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.374000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.397717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.452425 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475323 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.476333 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.479418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.494563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.554214 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.826691 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"]
Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.935960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"]
Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462138 4713 generic.go:334] "Generic (PLEG): container finished" podID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerID="d4b4e5eefafcfeaee5c9f5d40fb853163f9b52c4141752f9abd458d295d15c7b" exitCode=0
Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerDied","Data":"d4b4e5eefafcfeaee5c9f5d40fb853163f9b52c4141752f9abd458d295d15c7b"}
Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462508 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerStarted","Data":"a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693"}
Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.463611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerStarted","Data":"5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316"}
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.480655 4713 generic.go:334] "Generic (PLEG): container finished" podID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerID="5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879" exitCode=0
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.480721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerDied","Data":"5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879"}
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.761531 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") "
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") "
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904470 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") "
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.905278 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.910003 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.910043 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb" (OuterVolumeSpecName: "kube-api-access-vx6pb") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "kube-api-access-vx6pb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005413 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") on node \"crc\" DevicePath \"\""
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005449 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") on node \"crc\" DevicePath \"\""
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005460 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488069 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerDied","Data":"a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693"}
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488417 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693"
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488084 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.736839 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.917171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"2b849b06-281c-44be-a061-ca5b3905b3e1\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") "
Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.922541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn" (OuterVolumeSpecName: "kube-api-access-z5hvn") pod "2b849b06-281c-44be-a061-ca5b3905b3e1" (UID: "2b849b06-281c-44be-a061-ca5b3905b3e1"). InnerVolumeSpecName "kube-api-access-z5hvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.018849 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") on node \"crc\" DevicePath \"\""
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerDied","Data":"5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316"}
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495800 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316"
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495883 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c"
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500286 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500362 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500431 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.501106 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.501175 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" gracePeriod=600
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.826973 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"]
Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.832247 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"]
Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504283 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" exitCode=0
Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"}
Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"}
Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504609 4713 scope.go:117] "RemoveContainer" containerID="c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.357661 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"]
Mar 08 00:30:06 crc kubenswrapper[4713]: E0308 00:30:06.358046 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358072 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc"
Mar 08 00:30:06 crc kubenswrapper[4713]: E0308 00:30:06.358103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358114 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358249 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358275 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.359471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.374968 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"]
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.548716 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42829204-3911-4926-bcab-0e8f7b731986" path="/var/lib/kubelet/pods/42829204-3911-4926-bcab-0e8f7b731986/volumes"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.552712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.552792 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.554412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm"
Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod 
\"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.656073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.656090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.685679 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: 
\"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.975869 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.224184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.532842 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e"} Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.532897 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"c62b23f71ee51e106272541d3a540821e4dd4bad864cddd3f75035f7ae8459dd"} Mar 08 00:30:08 crc kubenswrapper[4713]: I0308 00:30:08.540995 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e" exitCode=0 Mar 08 00:30:08 crc kubenswrapper[4713]: I0308 00:30:08.551071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e"} Mar 08 00:30:09 crc kubenswrapper[4713]: I0308 00:30:09.550027 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4"} Mar 08 
00:30:10 crc kubenswrapper[4713]: I0308 00:30:10.557406 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4" exitCode=0 Mar 08 00:30:10 crc kubenswrapper[4713]: I0308 00:30:10.557586 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4"} Mar 08 00:30:11 crc kubenswrapper[4713]: I0308 00:30:11.564671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c"} Mar 08 00:30:11 crc kubenswrapper[4713]: I0308 00:30:11.582077 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-69sgm" podStartSLOduration=3.054591415 podStartE2EDuration="5.58206015s" podCreationTimestamp="2026-03-08 00:30:06 +0000 UTC" firstStartedPulling="2026-03-08 00:30:08.543057799 +0000 UTC m=+1462.662690032" lastFinishedPulling="2026-03-08 00:30:11.070526544 +0000 UTC m=+1465.190158767" observedRunningTime="2026-03-08 00:30:11.580711555 +0000 UTC m=+1465.700343798" watchObservedRunningTime="2026-03-08 00:30:11.58206015 +0000 UTC m=+1465.701692383" Mar 08 00:30:15 crc kubenswrapper[4713]: I0308 00:30:15.200132 4713 scope.go:117] "RemoveContainer" containerID="5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4" Mar 08 00:30:16 crc kubenswrapper[4713]: I0308 00:30:16.976913 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:16 crc kubenswrapper[4713]: I0308 00:30:16.977197 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:18 crc kubenswrapper[4713]: I0308 00:30:18.036303 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-69sgm" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" probeResult="failure" output=< Mar 08 00:30:18 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:30:18 crc kubenswrapper[4713]: > Mar 08 00:30:27 crc kubenswrapper[4713]: I0308 00:30:27.019452 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:27 crc kubenswrapper[4713]: I0308 00:30:27.059116 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:30 crc kubenswrapper[4713]: I0308 00:30:30.638202 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:30 crc kubenswrapper[4713]: I0308 00:30:30.639107 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-69sgm" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" containerID="cri-o://0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" gracePeriod=2 Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.096102 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" exitCode=0 Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.096142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c"} Mar 08 00:30:31 crc 
kubenswrapper[4713]: I0308 00:30:31.503721 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608596 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.609766 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities" (OuterVolumeSpecName: "utilities") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.615517 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5" (OuterVolumeSpecName: "kube-api-access-7tgt5") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "kube-api-access-7tgt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.710664 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.710710 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.730301 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.812189 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105290 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"c62b23f71ee51e106272541d3a540821e4dd4bad864cddd3f75035f7ae8459dd"} Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105339 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105379 4713 scope.go:117] "RemoveContainer" containerID="0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.132023 4713 scope.go:117] "RemoveContainer" containerID="6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.145888 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.150151 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.179233 4713 scope.go:117] "RemoveContainer" containerID="6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.547916 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957eea24-22ac-426b-abe9-996fdf130d19" path="/var/lib/kubelet/pods/957eea24-22ac-426b-abe9-996fdf130d19/volumes" Mar 08 00:30:45 crc 
kubenswrapper[4713]: I0308 00:30:45.193416 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d" exitCode=0 Mar 08 00:30:45 crc kubenswrapper[4713]: I0308 00:30:45.193501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d"} Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.447351 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602264 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602905 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603796 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603962 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603644 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604014 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604209 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604656 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604739 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605038 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605319 4713 
reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605380 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605443 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605497 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605602 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.606104 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.607493 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.608376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.610198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb" (OuterVolumeSpecName: "kube-api-access-s8flb") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "kube-api-access-s8flb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.706757 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707333 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707405 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707469 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707528 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.713620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.808991 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90"} Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210608 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210634 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.453137 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.498627 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499193 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="manage-dockerfile" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499221 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="manage-dockerfile" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499236 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-utilities" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499244 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-utilities" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499273 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-content" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499279 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-content" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499294 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499305 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" 
containerName="git-clone" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499313 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="git-clone" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499354 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499361 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499555 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499569 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.500616 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.511729 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521170 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521871 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.624275 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: 
\"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.625694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.625807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.626210 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.626297 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.651364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mb5\" (UniqueName: 
\"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.819405 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:48 crc kubenswrapper[4713]: I0308 00:30:48.114920 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:48 crc kubenswrapper[4713]: I0308 00:30:48.234244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerStarted","Data":"cb867c50000d8b99afc6f684e5f719dee646322428c1a08e8dabaed01ce9d7d2"} Mar 08 00:30:49 crc kubenswrapper[4713]: I0308 00:30:49.242210 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4" exitCode=0 Mar 08 00:30:49 crc kubenswrapper[4713]: I0308 00:30:49.242413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"} Mar 08 00:30:50 crc kubenswrapper[4713]: I0308 00:30:50.251571 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff" exitCode=0 Mar 08 00:30:50 crc kubenswrapper[4713]: I0308 00:30:50.251695 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" 
event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"} Mar 08 00:30:51 crc kubenswrapper[4713]: I0308 00:30:51.259942 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerStarted","Data":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"} Mar 08 00:30:51 crc kubenswrapper[4713]: I0308 00:30:51.305636 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntck8" podStartSLOduration=2.93329394 podStartE2EDuration="4.30561594s" podCreationTimestamp="2026-03-08 00:30:47 +0000 UTC" firstStartedPulling="2026-03-08 00:30:49.244570836 +0000 UTC m=+1503.364203069" lastFinishedPulling="2026-03-08 00:30:50.616892836 +0000 UTC m=+1504.736525069" observedRunningTime="2026-03-08 00:30:51.302683232 +0000 UTC m=+1505.422315465" watchObservedRunningTime="2026-03-08 00:30:51.30561594 +0000 UTC m=+1505.425248173" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.441177 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.442596 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.444368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.444768 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.445350 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.445363 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.461749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc 
kubenswrapper[4713]: I0308 00:30:55.526949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526969 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526991 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527034 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527153 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527269 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527363 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527446 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527487 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628698 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628745 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628770 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628789 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628883 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 
08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628950 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629070 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629919 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629938 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630016 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.634653 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 
00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.638194 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.646033 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.757210 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.984982 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: W0308 00:30:55.988978 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3c490af_d8bd_4659_b51d_6aec80c439c8.slice/crio-faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900 WatchSource:0}: Error finding container faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900: Status 404 returned error can't find the container with id faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900 Mar 08 00:30:56 crc kubenswrapper[4713]: I0308 00:30:56.297105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6"} Mar 08 00:30:56 crc kubenswrapper[4713]: I0308 00:30:56.297194 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900"} Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.304319 4713 generic.go:334] "Generic (PLEG): container finished" podID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerID="8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6" exitCode=0 Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.304364 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6"} Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.820086 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.820456 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.868306 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.311992 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3"} Mar 08 00:30:58 
crc kubenswrapper[4713]: I0308 00:30:58.351655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ntck8"
Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.368773 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=3.368726661 podStartE2EDuration="3.368726661s" podCreationTimestamp="2026-03-08 00:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:30:58.336141596 +0000 UTC m=+1512.455773849" watchObservedRunningTime="2026-03-08 00:30:58.368726661 +0000 UTC m=+1512.488358924"
Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.400799 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"]
Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.322909 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log"
Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.324562 4713 generic.go:334] "Generic (PLEG): container finished" podID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerID="fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3" exitCode=1
Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.324624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3"}
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.332367 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntck8" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server" containerID="cri-o://d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" gracePeriod=2
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.563358 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log"
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.564035 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.693575 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8"
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695043 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695084 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695177 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695225 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695337 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695429 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695636 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695680 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695700 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695703 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696045 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696167 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696200 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696215 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.697064 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities" (OuterVolumeSpecName: "utilities") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.697800 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698066 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698438 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698742 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.699024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.701805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.701873 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v" (OuterVolumeSpecName: "kube-api-access-gzx9v") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "kube-api-access-gzx9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.702172 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.796783 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.796863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") "
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797117 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797130 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797140 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797149 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797161 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797170 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797373 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797381 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797389 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797397 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.800680 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5" (OuterVolumeSpecName: "kube-api-access-j6mb5") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "kube-api-access-j6mb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.852571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.898726 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.898768 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") on node \"crc\" DevicePath \"\""
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341079 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" exitCode=0
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341182 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"}
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"cb867c50000d8b99afc6f684e5f719dee646322428c1a08e8dabaed01ce9d7d2"}
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341750 4713 scope.go:117] "RemoveContainer" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343200 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900"}
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343753 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343808 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.369521 4713 scope.go:117] "RemoveContainer" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.417080 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"]
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.422173 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"]
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.427607 4713 scope.go:117] "RemoveContainer" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444000 4713 scope.go:117] "RemoveContainer" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"
Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.444514 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": container with ID starting with d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d not found: ID does not exist" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444548 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"} err="failed to get container status \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": rpc error: code = NotFound desc = could not find container \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": container with ID starting with d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d not found: ID does not exist"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444569 4713 scope.go:117] "RemoveContainer" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"
Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.445140 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": container with ID starting with 27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff not found: ID does not exist" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.445195 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"} err="failed to get container status \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": rpc error: code = NotFound desc = could not find container \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": container with ID starting with 27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff not found: ID does not exist"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.445230 4713 scope.go:117] "RemoveContainer" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"
Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.445559 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": container with ID starting with 5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4 not found: ID does not exist" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"
Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.445589 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"} err="failed to get container status \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": rpc error: code = NotFound desc = could not find container \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": container with ID starting with 5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4 not found: ID does not exist"
Mar 08 00:31:02 crc kubenswrapper[4713]: I0308 00:31:02.550609 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" path="/var/lib/kubelet/pods/e89ade9c-892d-466e-bfaa-f45237078d28/volumes"
Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.056209 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.063085 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.555367 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" path="/var/lib/kubelet/pods/d3c490af-d8bd-4659-b51d-6aec80c439c8/volumes"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.633935 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"]
Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634342 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-utilities"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634360 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-utilities"
Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634376 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634382 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server"
Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634396 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-content"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634402 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-content"
Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634412 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="manage-dockerfile"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="manage-dockerfile"
Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634428 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="docker-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634433 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="docker-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634534 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634548 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="docker-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.637219 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643423 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643598 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643684 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643882 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"]
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.645482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804379 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804603 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804641 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804681 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804707 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906191 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906338 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906484 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906544 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906818 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906884 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 08 00:31:07 crc
kubenswrapper[4713]: I0308 00:31:07.906910 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907671 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908446 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.913101 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.917542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.923481 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.954063 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:08 crc kubenswrapper[4713]: I0308 00:31:08.151258 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 08 00:31:08 crc kubenswrapper[4713]: I0308 00:31:08.392739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c"} Mar 08 00:31:09 crc kubenswrapper[4713]: I0308 00:31:09.399934 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c"} Mar 08 00:31:10 crc kubenswrapper[4713]: I0308 00:31:10.407931 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c" exitCode=0 Mar 08 00:31:10 crc kubenswrapper[4713]: I0308 00:31:10.408037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c"} Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.415747 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="f5d5ac707738aa5e892d7bcbb3f9759e8231d82d6983c233b86849777da5ccaa" exitCode=0 Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.416075 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"f5d5ac707738aa5e892d7bcbb3f9759e8231d82d6983c233b86849777da5ccaa"} Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.460216 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_84b3ed06-5d45-4c0f-a4b4-bec838490219/manage-dockerfile/0.log" Mar 08 00:31:12 crc kubenswrapper[4713]: I0308 00:31:12.423612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee"} Mar 08 00:31:12 crc kubenswrapper[4713]: I0308 00:31:12.457673 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.457651641 podStartE2EDuration="5.457651641s" podCreationTimestamp="2026-03-08 00:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:12.451374754 +0000 UTC m=+1526.571007007" watchObservedRunningTime="2026-03-08 00:31:12.457651641 +0000 UTC m=+1526.577283874" Mar 08 00:31:14 crc kubenswrapper[4713]: I0308 00:31:14.440486 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee" exitCode=0 Mar 08 00:31:14 crc kubenswrapper[4713]: I0308 00:31:14.440543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee"} Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.740985 4713 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.916718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917945 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917979 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.918008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc 
kubenswrapper[4713]: I0308 00:31:15.918058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.918083 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917866 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919048 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919091 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919144 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919132 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919186 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919241 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919652 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919911 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919929 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919943 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919954 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919964 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.920016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.920092 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.921623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j" (OuterVolumeSpecName: "kube-api-access-4gx2j") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "kube-api-access-4gx2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923601 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923636 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.924747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021062 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021122 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021170 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021199 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021227 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021246 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021263 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c"} Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455440 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455450 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495035 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495653 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="manage-dockerfile" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495671 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="manage-dockerfile" Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495681 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="git-clone" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495689 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="git-clone" Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495711 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495719 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495885 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.496570 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499761 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499919 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499951 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.500292 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.513501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666715 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666733 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666779 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666845 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.667001 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 
00:31:19.667030 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.768551 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.768983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769366 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769193 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769666 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769780 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770020 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: 
I0308 00:31:19.770195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770340 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770791 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.771295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.772608 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.775157 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.775170 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: 
\"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.789508 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.813460 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.011322 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.481887 4713 generic.go:334] "Generic (PLEG): container finished" podID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerID="c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0" exitCode=0 Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.481944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0"} Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.482168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerStarted","Data":"b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de"} Mar 08 00:31:21 crc 
kubenswrapper[4713]: I0308 00:31:21.490987 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:21 crc kubenswrapper[4713]: I0308 00:31:21.491453 4713 generic.go:334] "Generic (PLEG): container finished" podID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerID="a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20" exitCode=1 Mar 08 00:31:21 crc kubenswrapper[4713]: I0308 00:31:21.491492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20"} Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.700710 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.701275 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807712 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807874 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807907 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807912 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807932 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808086 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808127 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808154 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808196 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808226 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808449 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808508 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808578 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808590 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808598 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808947 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809272 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809296 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809495 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809752 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813297 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813394 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813660 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5" (OuterVolumeSpecName: "kube-api-access-482h5") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "kube-api-access-482h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909284 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909324 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909337 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909345 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909354 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909363 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909373 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") on 
node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909383 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909395 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.504328 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505091 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de"} Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505131 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505167 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.000268 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.005789 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.549349 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" path="/var/lib/kubelet/pods/41a78c77-3173-4e12-b68e-a9421ccb4298/volumes" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.615544 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:31 crc kubenswrapper[4713]: E0308 00:31:31.616144 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="manage-dockerfile" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616161 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="manage-dockerfile" Mar 08 00:31:31 crc kubenswrapper[4713]: E0308 00:31:31.616186 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616194 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616321 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.617266 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619349 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619539 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619562 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619601 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.643103 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.724727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.724956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725151 
4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725350 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725435 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725588 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827187 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827234 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827274 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827295 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827326 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827351 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827438 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827492 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: 
I0308 00:31:31.827892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828381 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828644 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828863 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.829153 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.842454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.842783 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.845616 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.930810 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.168441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.561341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856"} Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.561384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d"} Mar 08 00:31:34 crc kubenswrapper[4713]: I0308 00:31:34.574538 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856" exitCode=0 Mar 08 00:31:34 crc 
kubenswrapper[4713]: I0308 00:31:34.574839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856"} Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.583361 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="9b66fc43b9ecb8837083ebc5b05d6a1cc956eabe67d66e3c5d86e4e7327451a5" exitCode=0 Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.583465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"9b66fc43b9ecb8837083ebc5b05d6a1cc956eabe67d66e3c5d86e4e7327451a5"} Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.625101 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_eb0ec6e5-4cc4-4c52-a320-a163af42eca6/manage-dockerfile/0.log" Mar 08 00:31:36 crc kubenswrapper[4713]: I0308 00:31:36.597740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c"} Mar 08 00:31:36 crc kubenswrapper[4713]: I0308 00:31:36.636770 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.636748292 podStartE2EDuration="5.636748292s" podCreationTimestamp="2026-03-08 00:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:36.62836931 +0000 UTC m=+1550.748001563" 
watchObservedRunningTime="2026-03-08 00:31:36.636748292 +0000 UTC m=+1550.756380525" Mar 08 00:31:39 crc kubenswrapper[4713]: I0308 00:31:39.625446 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c" exitCode=0 Mar 08 00:31:39 crc kubenswrapper[4713]: I0308 00:31:39.625490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c"} Mar 08 00:31:40 crc kubenswrapper[4713]: I0308 00:31:40.855753 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053411 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053484 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053553 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: 
\"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053603 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053644 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053674 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053702 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054170 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054296 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054466 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054648 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054668 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054679 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054713 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles" (OuterVolumeSpecName: 
"build-ca-bundles") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.055080 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.055165 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.056280 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.056379 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.058183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.059584 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.059920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.060124 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf" (OuterVolumeSpecName: "kube-api-access-gqwtf") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "kube-api-access-gqwtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155188 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155221 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155233 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155245 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155256 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155264 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155273 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqwtf\" (UniqueName: 
\"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155282 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155290 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643538 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d"} Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643611 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643609 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.834968 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835667 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835679 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835690 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="manage-dockerfile" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835697 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="manage-dockerfile" Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835708 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="git-clone" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="git-clone" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835804 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.836610 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839474 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839638 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839869 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.840723 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.840913 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856156 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856865 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856890 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856996 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857077 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960428 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960462 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960513 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960537 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.961262 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.961749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 
00:31:56.962229 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962456 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962552 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.968579 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.969852 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.971985 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.975254 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.981471 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.154095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.378114 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.754390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e"} Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.754450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb"} Mar 08 00:31:58 crc kubenswrapper[4713]: I0308 00:31:58.762566 4713 generic.go:334] "Generic (PLEG): container finished" podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" 
containerID="0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e" exitCode=0 Mar 08 00:31:58 crc kubenswrapper[4713]: I0308 00:31:58.762614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e"} Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.771067 4713 generic.go:334] "Generic (PLEG): container finished" podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerID="5fe140ef81009f2c519e74f40fd71a40332e5dc01b26fb1ae49ef3ff0efa8c16" exitCode=0 Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.771118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"5fe140ef81009f2c519e74f40fd71a40332e5dc01b26fb1ae49ef3ff0efa8c16"} Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.804607 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_a1f08b8e-b7bf-4e1a-934f-b3dd95201eab/manage-dockerfile/0.log" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.132803 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.133943 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136635 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136758 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136846 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.141294 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.307339 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.408679 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.428680 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " 
pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.451409 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.641460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: W0308 00:32:00.651005 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a13b2b_064d_4323_8d5c_d86f76405f38.slice/crio-3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea WatchSource:0}: Error finding container 3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea: Status 404 returned error can't find the container with id 3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.779874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c"} Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.782087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerStarted","Data":"3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea"} Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.821641 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.821620852 podStartE2EDuration="4.821620852s" podCreationTimestamp="2026-03-08 00:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:00.813535407 +0000 UTC m=+1574.933167650" watchObservedRunningTime="2026-03-08 00:32:00.821620852 +0000 UTC m=+1574.941253085" Mar 08 00:32:02 crc kubenswrapper[4713]: I0308 00:32:02.796351 4713 generic.go:334] "Generic (PLEG): container finished" podID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerID="d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255" exitCode=0 Mar 08 00:32:02 crc kubenswrapper[4713]: I0308 00:32:02.796410 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerDied","Data":"d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255"} Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.020178 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.154286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"d0a13b2b-064d-4323-8d5c-d86f76405f38\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.159954 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr" (OuterVolumeSpecName: "kube-api-access-jwvbr") pod "d0a13b2b-064d-4323-8d5c-d86f76405f38" (UID: "d0a13b2b-064d-4323-8d5c-d86f76405f38"). InnerVolumeSpecName "kube-api-access-jwvbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.255216 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerDied","Data":"3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea"} Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810898 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810615 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:05 crc kubenswrapper[4713]: I0308 00:32:05.076528 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"] Mar 08 00:32:05 crc kubenswrapper[4713]: I0308 00:32:05.095718 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"] Mar 08 00:32:06 crc kubenswrapper[4713]: I0308 00:32:06.548396 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" path="/var/lib/kubelet/pods/45fc1987-0bdc-476c-9315-18ddbf570461/volumes" Mar 08 00:32:15 crc kubenswrapper[4713]: I0308 00:32:15.294019 4713 scope.go:117] "RemoveContainer" containerID="76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c" Mar 08 00:32:28 crc kubenswrapper[4713]: I0308 00:32:28.956034 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerID="0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c" exitCode=0 Mar 08 00:32:28 crc kubenswrapper[4713]: I0308 00:32:28.956123 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c"} Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.188785 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387287 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387365 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387457 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387508 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387446 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387716 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387789 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387874 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387911 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod 
\"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.388483 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389178 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389373 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390653 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390789 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.393849 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.394171 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.394729 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.395781 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr" (OuterVolumeSpecName: "kube-api-access-lpmmr") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "kube-api-access-lpmmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489136 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489353 4713 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489442 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489531 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489623 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489697 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489772 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489864 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489950 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.490021 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.625321 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.698186 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969160 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb"} Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969195 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969550 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:32:31 crc kubenswrapper[4713]: I0308 00:32:31.389807 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:31 crc kubenswrapper[4713]: I0308 00:32:31.405054 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.170704 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.170992 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="manage-dockerfile" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171008 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="manage-dockerfile" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171021 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171028 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171040 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="git-clone" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171046 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="git-clone" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171057 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171063 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171163 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171172 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171574 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.176211 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-dkxrf" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.183282 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.215950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.317472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.349806 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bsvm\" 
(UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.488976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:33 crc kubenswrapper[4713]: I0308 00:32:33.868794 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:33 crc kubenswrapper[4713]: I0308 00:32:33.993977 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-qz9xc" event={"ID":"aab425e0-6643-4517-893f-6a638b8ae66d","Type":"ContainerStarted","Data":"0f0b7586d1a19f7104b1b86f7584f513a6655a21578c4d11fbfb55a6aaa1dd71"} Mar 08 00:32:34 crc kubenswrapper[4713]: I0308 00:32:34.500953 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:32:34 crc kubenswrapper[4713]: I0308 00:32:34.501337 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:32:36 crc kubenswrapper[4713]: I0308 00:32:36.974644 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.774092 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.775169 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.789072 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.890722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.992339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:38 crc kubenswrapper[4713]: I0308 00:32:38.022556 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:38 crc kubenswrapper[4713]: I0308 00:32:38.102253 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:44 crc kubenswrapper[4713]: I0308 00:32:44.691184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:46 crc kubenswrapper[4713]: W0308 00:32:46.797298 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f96514_d597_436a_8158_0535f61fa6f8.slice/crio-fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb WatchSource:0}: Error finding container fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb: Status 404 returned error can't find the container with id fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.862293 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.862461 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9bsvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-qz9xc_service-telemetry(aab425e0-6643-4517-893f-6a638b8ae66d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.863632 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/infrawatch-operators-qz9xc" podUID="aab425e0-6643-4517-893f-6a638b8ae66d" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.090739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/infrawatch-operators-rx6bq" event={"ID":"40f96514-d597-436a-8158-0535f61fa6f8","Type":"ContainerStarted","Data":"acd23d11a52b08e960c2e2ebe308e196e7d9017fcce21b077ff99189646db2cd"} Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.090889 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-rx6bq" event={"ID":"40f96514-d597-436a-8158-0535f61fa6f8","Type":"ContainerStarted","Data":"fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb"} Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.328300 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.345982 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-rx6bq" podStartSLOduration=10.234165673 podStartE2EDuration="10.345960743s" podCreationTimestamp="2026-03-08 00:32:37 +0000 UTC" firstStartedPulling="2026-03-08 00:32:46.80031753 +0000 UTC m=+1620.919949763" lastFinishedPulling="2026-03-08 00:32:46.9121126 +0000 UTC m=+1621.031744833" observedRunningTime="2026-03-08 00:32:47.125195629 +0000 UTC m=+1621.244827862" watchObservedRunningTime="2026-03-08 00:32:47.345960743 +0000 UTC m=+1621.465592996" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.512691 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"aab425e0-6643-4517-893f-6a638b8ae66d\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.517688 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm" (OuterVolumeSpecName: "kube-api-access-9bsvm") pod 
"aab425e0-6643-4517-893f-6a638b8ae66d" (UID: "aab425e0-6643-4517-893f-6a638b8ae66d"). InnerVolumeSpecName "kube-api-access-9bsvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.614435 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.097617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-qz9xc" event={"ID":"aab425e0-6643-4517-893f-6a638b8ae66d","Type":"ContainerDied","Data":"0f0b7586d1a19f7104b1b86f7584f513a6655a21578c4d11fbfb55a6aaa1dd71"} Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.097662 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.102601 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.102640 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.134478 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.152710 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.159612 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.550522 4713 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="aab425e0-6643-4517-893f-6a638b8ae66d" path="/var/lib/kubelet/pods/aab425e0-6643-4517-893f-6a638b8ae66d/volumes" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.780145 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.782766 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.794465 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.920737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.920855 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.921390 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022418 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022567 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022926 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.023055 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.052736 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.112709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.552398 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.160882 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" exitCode=0 Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.162029 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785"} Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.162344 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"9f82c3dff1939e1e71812e7fa1c087d46f2fb778390ac98acb85ef0038e4d1b2"} Mar 08 00:32:58 crc kubenswrapper[4713]: I0308 00:32:58.132331 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:58 crc kubenswrapper[4713]: I0308 00:32:58.174424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" 
event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.181751 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" exitCode=0 Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.181930 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.815441 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.816923 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.830029 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874161 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874257 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") 
pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975198 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975968 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.976291 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " 
pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.994763 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.142256 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.195704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.210360 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svj4c" podStartSLOduration=2.816717234 podStartE2EDuration="5.210337641s" podCreationTimestamp="2026-03-08 00:32:55 +0000 UTC" firstStartedPulling="2026-03-08 00:32:57.166241607 +0000 UTC m=+1631.285873840" lastFinishedPulling="2026-03-08 00:32:59.559862014 +0000 UTC m=+1633.679494247" observedRunningTime="2026-03-08 00:33:00.209330134 +0000 UTC m=+1634.328962397" watchObservedRunningTime="2026-03-08 00:33:00.210337641 +0000 UTC m=+1634.329969874" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.559361 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:33:00 crc kubenswrapper[4713]: 
W0308 00:33:00.559432 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3468be8a_1655_46bd_869e_a1f4653984f1.slice/crio-367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b WatchSource:0}: Error finding container 367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b: Status 404 returned error can't find the container with id 367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.831252 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.832804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.843552 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885703 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885747 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885770 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc 
kubenswrapper[4713]: I0308 00:33:00.988106 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.988323 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.014840 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.186286 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.213468 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="add7fb42d76faa64b6aeb50aea2a78e03bf22d54af5bbda2ea40f47b22633147" exitCode=0 Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.214240 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"add7fb42d76faa64b6aeb50aea2a78e03bf22d54af5bbda2ea40f47b22633147"} Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.214292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerStarted","Data":"367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b"} Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.218472 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.394720 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:01 crc kubenswrapper[4713]: W0308 00:33:01.486338 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47eaf9e_75d1_40eb_8671_0ebc9ca47520.slice/crio-70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652 WatchSource:0}: Error finding container 70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652: Status 404 returned error can't find the container with id 70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652 Mar 08 
00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.221944 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="a43b41dbd861e871f5b111d9733e70466dca8e98a32ae0a6001284be33a60d23" exitCode=0 Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.222293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"a43b41dbd861e871f5b111d9733e70466dca8e98a32ae0a6001284be33a60d23"} Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226368 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="05d0b9d876c6f453a46a6ed447b8a8ce4f6de5109efd4222aac4fb2454f59dbc" exitCode=0 Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226414 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"05d0b9d876c6f453a46a6ed447b8a8ce4f6de5109efd4222aac4fb2454f59dbc"} Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerStarted","Data":"70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652"} Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.240276 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="5012414fcce13bcdc53ece6374103d9afa70ccdbaf616a7e5f092f2167af931a" exitCode=0 Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.240345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"5012414fcce13bcdc53ece6374103d9afa70ccdbaf616a7e5f092f2167af931a"} Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.244762 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="50c3f41e381cc6a47b78b7dc7f983678a1c1769067c43239a6cb4196bfbab5c2" exitCode=0 Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.244804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"50c3f41e381cc6a47b78b7dc7f983678a1c1769067c43239a6cb4196bfbab5c2"} Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.252173 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="66c274bef3a35763566cb4084532025a581f2bf1d99a6c94c69c7883f0d852dd" exitCode=0 Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.252381 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"66c274bef3a35763566cb4084532025a581f2bf1d99a6c94c69c7883f0d852dd"} Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.475585 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.500659 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.500725 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.539525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.540879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle" (OuterVolumeSpecName: "bundle") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640399 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640802 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.646105 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4" (OuterVolumeSpecName: "kube-api-access-pz8k4") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "kube-api-access-pz8k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.661075 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util" (OuterVolumeSpecName: "util") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.742052 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.742288 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b"} Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266333 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266397 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.534292 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.674808 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675096 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675181 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675585 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle" (OuterVolumeSpecName: "bundle") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.685006 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp" (OuterVolumeSpecName: "kube-api-access-sm5dp") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "kube-api-access-sm5dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.693798 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util" (OuterVolumeSpecName: "util") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.776985 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.777018 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.777031 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.112960 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.113013 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.153691 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276323 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652"} Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276420 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.317595 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.367550 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.368049 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svj4c" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" containerID="cri-o://ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" gracePeriod=2 Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.699965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.822341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities" (OuterVolumeSpecName: "utilities") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.822787 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.827096 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25" (OuterVolumeSpecName: "kube-api-access-5xw25") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "kube-api-access-5xw25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.880129 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.924047 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.924076 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.295997 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" exitCode=0 Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"9f82c3dff1939e1e71812e7fa1c087d46f2fb778390ac98acb85ef0038e4d1b2"} Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296312 4713 scope.go:117] "RemoveContainer" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296084 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.330876 4713 scope.go:117] "RemoveContainer" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.334762 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.339787 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.358803 4713 scope.go:117] "RemoveContainer" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.373679 4713 scope.go:117] "RemoveContainer" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.374182 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": container with ID starting with ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b not found: ID does not exist" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.374227 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} err="failed to get container status \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": rpc error: code = NotFound desc = could not find container \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": container with ID starting with ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b not 
found: ID does not exist" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.374259 4713 scope.go:117] "RemoveContainer" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.375404 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": container with ID starting with ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a not found: ID does not exist" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375448 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} err="failed to get container status \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": rpc error: code = NotFound desc = could not find container \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": container with ID starting with ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a not found: ID does not exist" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375473 4713 scope.go:117] "RemoveContainer" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.375714 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": container with ID starting with 99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785 not found: ID does not exist" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375748 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785"} err="failed to get container status \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": rpc error: code = NotFound desc = could not find container \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": container with ID starting with 99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785 not found: ID does not exist" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.549131 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" path="/var/lib/kubelet/pods/dca9cde6-7c79-47bd-aacc-d326268e5595/volumes" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.908764 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909263 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-content" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909275 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-content" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909283 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909289 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909298 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909305 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-utilities" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909323 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-utilities" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909334 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909339 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909348 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909353 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909365 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909371 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909380 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909386 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909395 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909400 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909493 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909509 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909517 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909898 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.912428 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-h6xd8" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.949247 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.949407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.959961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.075734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.229366 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.676206 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:11 crc kubenswrapper[4713]: W0308 00:33:11.681760 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod934a7934_e52f_4279_9c2a_4255daf78d5a.slice/crio-134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6 WatchSource:0}: Error finding container 134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6: Status 404 returned error can't find the container with id 134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6 Mar 08 00:33:12 crc kubenswrapper[4713]: I0308 00:33:12.319986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" event={"ID":"934a7934-e52f-4279-9c2a-4255daf78d5a","Type":"ContainerStarted","Data":"134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6"} Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.911095 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.912001 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.912063 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.917280 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-rwbl6" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.019804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.020172 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133149 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.159972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.234556 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.427779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:15 crc kubenswrapper[4713]: I0308 00:33:15.350647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" event={"ID":"c714eef0-0fe5-4836-80e1-c640aa9527e7","Type":"ContainerStarted","Data":"9737dd344b4808d0d70e88f2bd07c93cf44ed1c47bfa5c875aa40d32d36f57e3"} Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.466938 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.467717 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{
Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1772929848,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lh5fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-795859486c-d7k9q_service-telemetry(934a7934-e52f-4279-9c2a-4255daf78d5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.468926 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" 
podUID="934a7934-e52f-4279-9c2a-4255daf78d5a" Mar 08 00:33:28 crc kubenswrapper[4713]: E0308 00:33:28.458185 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" podUID="934a7934-e52f-4279-9c2a-4255daf78d5a" Mar 08 00:33:33 crc kubenswrapper[4713]: I0308 00:33:33.498296 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" event={"ID":"c714eef0-0fe5-4836-80e1-c640aa9527e7","Type":"ContainerStarted","Data":"a6d41f541dc39b5caf5a4c1055633279153e411158c2089867c891f8a910d42f"} Mar 08 00:33:33 crc kubenswrapper[4713]: I0308 00:33:33.516396 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" podStartSLOduration=2.542619887 podStartE2EDuration="20.516377944s" podCreationTimestamp="2026-03-08 00:33:13 +0000 UTC" firstStartedPulling="2026-03-08 00:33:14.434716931 +0000 UTC m=+1648.554349164" lastFinishedPulling="2026-03-08 00:33:32.408474988 +0000 UTC m=+1666.528107221" observedRunningTime="2026-03-08 00:33:33.515807379 +0000 UTC m=+1667.635439632" watchObservedRunningTime="2026-03-08 00:33:33.516377944 +0000 UTC m=+1667.636010167" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500305 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500377 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500429 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.501149 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.501209 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" gracePeriod=600 Mar 08 00:33:34 crc kubenswrapper[4713]: E0308 00:33:34.618742 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.523955 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" exitCode=0 Mar 08 
00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"} Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524428 4713 scope.go:117] "RemoveContainer" containerID="bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524960 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:33:35 crc kubenswrapper[4713]: E0308 00:33:35.525175 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:42 crc kubenswrapper[4713]: I0308 00:33:42.570108 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" event={"ID":"934a7934-e52f-4279-9c2a-4255daf78d5a","Type":"ContainerStarted","Data":"55d59f49b25ee77591ecf1d954ac0737b918ba0688322fb82ae0f4139f4d3519"} Mar 08 00:33:42 crc kubenswrapper[4713]: I0308 00:33:42.596470 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" podStartSLOduration=2.213295157 podStartE2EDuration="32.596443628s" podCreationTimestamp="2026-03-08 00:33:10 +0000 UTC" firstStartedPulling="2026-03-08 00:33:11.683561097 +0000 UTC m=+1645.803193330" lastFinishedPulling="2026-03-08 00:33:42.066709568 +0000 UTC m=+1676.186341801" 
observedRunningTime="2026-03-08 00:33:42.589416041 +0000 UTC m=+1676.709048274" watchObservedRunningTime="2026-03-08 00:33:42.596443628 +0000 UTC m=+1676.716075871" Mar 08 00:33:46 crc kubenswrapper[4713]: I0308 00:33:46.544286 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:33:46 crc kubenswrapper[4713]: E0308 00:33:46.544782 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.439164 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.440607 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.442414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443096 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-8dc86" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443097 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443262 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443869 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.445791 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.447606 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.458558 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " 
pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568902 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568929 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569197 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670168 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670190 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") 
pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670212 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670298 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc 
kubenswrapper[4713]: I0308 00:33:53.671279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.676314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.677770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.685242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.686716 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod 
\"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.687474 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.703033 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.765203 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:54 crc kubenswrapper[4713]: I0308 00:33:54.211084 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:54 crc kubenswrapper[4713]: I0308 00:33:54.650429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerStarted","Data":"71917d86375943e31a9292ae7412991594bcc498f11ed7d30ee0bdc265d89c06"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.141182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.142198 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144575 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144752 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144940 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.151748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.160085 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" 
(UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.260864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.291915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.465780 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.549673 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:00 crc kubenswrapper[4713]: E0308 00:34:00.550144 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.652556 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: W0308 00:34:00.653488 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef90820d_fdcc_4ff1_97db_756e8c96851a.slice/crio-fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d WatchSource:0}: Error finding container fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d: Status 404 returned error can't find the container with id fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.703807 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerStarted","Data":"fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.705053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" 
event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerStarted","Data":"2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.721011 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" podStartSLOduration=1.914970427 podStartE2EDuration="7.72099367s" podCreationTimestamp="2026-03-08 00:33:53 +0000 UTC" firstStartedPulling="2026-03-08 00:33:54.214460911 +0000 UTC m=+1688.334093144" lastFinishedPulling="2026-03-08 00:34:00.020484124 +0000 UTC m=+1694.140116387" observedRunningTime="2026-03-08 00:34:00.720030335 +0000 UTC m=+1694.839662588" watchObservedRunningTime="2026-03-08 00:34:00.72099367 +0000 UTC m=+1694.840625903" Mar 08 00:34:01 crc kubenswrapper[4713]: I0308 00:34:01.714681 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerStarted","Data":"54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3"} Mar 08 00:34:01 crc kubenswrapper[4713]: I0308 00:34:01.729906 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548834-njxhh" podStartSLOduration=0.91415066 podStartE2EDuration="1.729884406s" podCreationTimestamp="2026-03-08 00:34:00 +0000 UTC" firstStartedPulling="2026-03-08 00:34:00.657032371 +0000 UTC m=+1694.776664604" lastFinishedPulling="2026-03-08 00:34:01.472766097 +0000 UTC m=+1695.592398350" observedRunningTime="2026-03-08 00:34:01.724880454 +0000 UTC m=+1695.844512707" watchObservedRunningTime="2026-03-08 00:34:01.729884406 +0000 UTC m=+1695.849516629" Mar 08 00:34:02 crc kubenswrapper[4713]: I0308 00:34:02.727000 4713 generic.go:334] "Generic (PLEG): container finished" podID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerID="54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3" 
exitCode=0 Mar 08 00:34:02 crc kubenswrapper[4713]: I0308 00:34:02.727057 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerDied","Data":"54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3"} Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.772247 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.773901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778222 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778345 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778476 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778584 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778694 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778744 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 
00:34:03.778902 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-78nxz" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.779003 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.779084 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.792813 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911003 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911048 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911072 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 
00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911168 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.997659 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.012774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013280 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013745 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013804 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013851 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: 
\"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" 
(UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.014006 4713 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.014143 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls podName:cf91b8a6-24ec-4c39-8337-f05acf19e199 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:04.514108128 +0000 UTC m=+1698.633740371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "cf91b8a6-24ec-4c39-8337-f05acf19e199") : secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015439 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.016331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 
00:34:04.020479 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.020513 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f960d04c1718d4ca7632e6054426d041bf9e016104b49269b2d10d057333c68/globalmount\"" pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021663 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021914 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.022789 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.024130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.044724 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.049877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.115631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"ef90820d-fdcc-4ff1-97db-756e8c96851a\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") "
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.118869 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc" (OuterVolumeSpecName: "kube-api-access-9p7jc") pod "ef90820d-fdcc-4ff1-97db-756e8c96851a" (UID: "ef90820d-fdcc-4ff1-97db-756e8c96851a"). InnerVolumeSpecName "kube-api-access-9p7jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.217525 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") on node \"crc\" DevicePath \"\""
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.521655 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.521906 4713 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.521963 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls podName:cf91b8a6-24ec-4c39-8337-f05acf19e199 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:05.521944996 +0000 UTC m=+1699.641577239 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "cf91b8a6-24ec-4c39-8337-f05acf19e199") : secret "default-prometheus-proxy-tls" not found
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.754893 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerDied","Data":"fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d"}
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.754934 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.755001 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh"
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.779928 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.786661 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.535849 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.543136 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.597052 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.818705 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 08 00:34:06 crc kubenswrapper[4713]: I0308 00:34:06.548249 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" path="/var/lib/kubelet/pods/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3/volumes"
Mar 08 00:34:06 crc kubenswrapper[4713]: I0308 00:34:06.776645 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"afa3c68f33fcf026ae023a61d4786edb94046d5cecf6df345b66c62165522196"}
Mar 08 00:34:09 crc kubenswrapper[4713]: I0308 00:34:09.797684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a"}
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653163 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"]
Mar 08 00:34:13 crc kubenswrapper[4713]: E0308 00:34:13.653431 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerName="oc"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653443 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerName="oc"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653545 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerName="oc"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653969 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.668740 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"]
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.758228 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.858934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.882335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62"
Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.981420 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62"
Mar 08 00:34:14 crc kubenswrapper[4713]: I0308 00:34:14.201875 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"]
Mar 08 00:34:14 crc kubenswrapper[4713]: I0308 00:34:14.828926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" event={"ID":"6bdaeb5b-32b1-4454-9a68-0893de41cc75","Type":"ContainerStarted","Data":"321dfb024e3ca1c0e5b1d095dd0dfe4e9ba64d4c64cbaa49ee57bd89064c6e6f"}
Mar 08 00:34:15 crc kubenswrapper[4713]: I0308 00:34:15.540503 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:34:15 crc kubenswrapper[4713]: E0308 00:34:15.541004 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:34:16 crc kubenswrapper[4713]: I0308 00:34:16.846410 4713 generic.go:334] "Generic (PLEG): container finished" podID="cf91b8a6-24ec-4c39-8337-f05acf19e199" containerID="f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a" exitCode=0
Mar 08 00:34:16 crc kubenswrapper[4713]: I0308 00:34:16.846453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerDied","Data":"f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a"}
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.527733 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.529445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.531494 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.531697 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.532058 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-stbp8"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.532137 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.534400 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.535242 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.549080 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612783 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612964 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.613012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714620 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714643 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714695 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714711 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: E0308 00:34:17.714881 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:17 crc kubenswrapper[4713]: E0308 00:34:17.714937 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:18.214914271 +0000 UTC m=+1712.334546514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.720961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.721325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.722001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.722645 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.723504 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.723531 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8166928179f9697bb27271c5054606ad15f49cf71086ec4487477abe8fe5c88e/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.724211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.736068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.740770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.761246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:18 crc kubenswrapper[4713]: I0308 00:34:18.224682 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:18 crc kubenswrapper[4713]: E0308 00:34:18.224961 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:18 crc kubenswrapper[4713]: E0308 00:34:18.225017 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:19.22499791 +0000 UTC m=+1713.344630143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:19 crc kubenswrapper[4713]: I0308 00:34:19.240879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:19 crc kubenswrapper[4713]: E0308 00:34:19.241087 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:19 crc kubenswrapper[4713]: E0308 00:34:19.241387 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:21.241365905 +0000 UTC m=+1715.360998138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found
Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.275408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.289638 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.450919 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.566648 4713 scope.go:117] "RemoveContainer" containerID="ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71"
Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.192223 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.900835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" event={"ID":"6bdaeb5b-32b1-4454-9a68-0893de41cc75","Type":"ContainerStarted","Data":"ab0d8dd635c9519b06a29c0febcdbe31ec56160d22dd909035519284a196f3f3"}
Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.903061 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"3eefc93efd42b43d62250985baecbbf57dbaf8879dbc4ec699d874d8bebd51e3"}
Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.925652 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" podStartSLOduration=2.316121308 podStartE2EDuration="10.925629813s" podCreationTimestamp="2026-03-08 00:34:13 +0000 UTC" firstStartedPulling="2026-03-08 00:34:14.21798119 +0000 UTC m=+1708.337613423" lastFinishedPulling="2026-03-08 00:34:22.827489695 +0000 UTC m=+1716.947121928" observedRunningTime="2026-03-08 00:34:23.9145975 +0000 UTC m=+1718.034229743" watchObservedRunningTime="2026-03-08 00:34:23.925629813 +0000 UTC m=+1718.045262046"
Mar 08 00:34:25 crc kubenswrapper[4713]: I0308 00:34:25.917839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b"}
Mar 08 00:34:26 crc kubenswrapper[4713]: I0308 00:34:26.548073 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:34:26 crc kubenswrapper[4713]: E0308 00:34:26.548297 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:34:27 crc kubenswrapper[4713]: I0308 00:34:27.935904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"bc4fb448f721b6bb976cb2e1f49345a27cd1c296353402161de108ed025f0716"}
Mar 08 00:34:28 crc kubenswrapper[4713]: I0308 00:34:28.944224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"3ce63770185f927d536050fcdf86cad8cc018a110fb681959f2e74ddef692d8e"}
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.934084 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"]
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.941187 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.942626 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-tzd69"
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.942892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret"
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.943892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls"
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.944197 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap"
Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.944588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"]
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007631 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007717 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007745 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.108883 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109004 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109041 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109116 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"
Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.109370 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.109453 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls podName:7aaf11cd-f1cf-42c7-9fe9-52880e0af19c nodeName:}" failed. No retries permitted until 2026-03-08 00:34:31.60943345 +0000 UTC m=+1725.729065683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" (UID: "7aaf11cd-f1cf-42c7-9fe9-52880e0af19c") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109948 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.120600 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.127637 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.618478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.618658 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.618715 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls podName:7aaf11cd-f1cf-42c7-9fe9-52880e0af19c nodeName:}" failed. No retries permitted until 2026-03-08 00:34:32.618698147 +0000 UTC m=+1726.738330380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" (UID: "7aaf11cd-f1cf-42c7-9fe9-52880e0af19c") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.633421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.639060 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.767779 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.001352 4713 generic.go:334] "Generic (PLEG): container finished" podID="76d6e5d8-8303-43ac-a477-0dfe579adad2" containerID="e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b" exitCode=0 Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.001974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerDied","Data":"e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b"} Mar 08 00:34:33 crc kubenswrapper[4713]: W0308 00:34:33.096155 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aaf11cd_f1cf_42c7_9fe9_52880e0af19c.slice/crio-d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2 WatchSource:0}: Error finding container d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2: Status 404 returned error can't find the container with id d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2 Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.103340 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.706495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.707754 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.710317 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.710331 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.719819 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756700 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4p4g\" (UniqueName: \"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: 
\"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756784 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756847 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.859994 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860073 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 
08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4p4g\" (UniqueName: \"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860129 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: E0308 00:34:33.860280 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:33 crc kubenswrapper[4713]: E0308 00:34:33.860332 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls podName:367439a6-a382-49f1-b0af-cf399b5a6401 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:34:34.360316205 +0000 UTC m=+1728.479948438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" (UID: "367439a6-a382-49f1-b0af-cf399b5a6401") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.863746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.865384 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.867780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.889617 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4p4g\" (UniqueName: 
\"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:34 crc kubenswrapper[4713]: I0308 00:34:34.012316 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2"} Mar 08 00:34:34 crc kubenswrapper[4713]: I0308 00:34:34.365458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:34 crc kubenswrapper[4713]: E0308 00:34:34.365614 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:34 crc kubenswrapper[4713]: E0308 00:34:34.365662 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls podName:367439a6-a382-49f1-b0af-cf399b5a6401 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:35.365648638 +0000 UTC m=+1729.485280871 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" (UID: "367439a6-a382-49f1-b0af-cf399b5a6401") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.380268 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.391554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.525342 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:38 crc kubenswrapper[4713]: I0308 00:34:38.547024 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:38 crc kubenswrapper[4713]: E0308 00:34:38.547731 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.073257 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.075370 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.084297 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.084328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.129382 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: 
\"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5vw\" (UniqueName: \"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" 
Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258427 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258529 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5vw\" (UniqueName: \"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.258685 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.258785 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls podName:fff80c8a-de9a-483b-8be3-5ce1423649cb nodeName:}" failed. 
No retries permitted until 2026-03-08 00:34:39.758764933 +0000 UTC m=+1733.878397166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" (UID: "fff80c8a-de9a-483b-8be3-5ce1423649cb") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258783 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.259632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.272209 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.296176 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5vw\" (UniqueName: 
\"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.764774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.765046 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.765228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls podName:fff80c8a-de9a-483b-8be3-5ce1423649cb nodeName:}" failed. No retries permitted until 2026-03-08 00:34:40.765209105 +0000 UTC m=+1734.884841338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" (UID: "fff80c8a-de9a-483b-8be3-5ce1423649cb") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.733596 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.783315 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.791725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:40 crc kubenswrapper[4713]: W0308 00:34:40.883081 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod367439a6_a382_49f1_b0af_cf399b5a6401.slice/crio-db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc WatchSource:0}: Error finding container db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc: Status 404 returned error can't find the container with id 
db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.902138 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.113996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc"} Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.146639 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"de59d2f03d3d2f84d9171aac8cf777a73135b57a458d2b531e6aaed4c253de19"} Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.180351 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.746775209 podStartE2EDuration="39.180327321s" podCreationTimestamp="2026-03-08 00:34:02 +0000 UTC" firstStartedPulling="2026-03-08 00:34:05.822537241 +0000 UTC m=+1699.942169484" lastFinishedPulling="2026-03-08 00:34:40.256089363 +0000 UTC m=+1734.375721596" observedRunningTime="2026-03-08 00:34:41.174753933 +0000 UTC m=+1735.294386176" watchObservedRunningTime="2026-03-08 00:34:41.180327321 +0000 UTC m=+1735.299959564" Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.369170 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:42 crc kubenswrapper[4713]: I0308 00:34:42.153597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" 
event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"21359b62803c17db8a61e255ac740d8bb95576dae94b515912debfa309c1e4b3"} Mar 08 00:34:42 crc kubenswrapper[4713]: I0308 00:34:42.156206 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"6516bdef4f9306692f82cf58bd85b7ff26eccea5c9321e0980e559bd7036b868"} Mar 08 00:34:45 crc kubenswrapper[4713]: I0308 00:34:45.597794 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.567736 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.569391 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.573184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.573185 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.577314 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.671899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod 
\"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.671973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.672121 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.672214 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.773879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.773954 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.774028 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.774069 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.775097 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.775450 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.786549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.796599 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.921027 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:48 crc kubenswrapper[4713]: I0308 00:34:48.676079 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:48 crc kubenswrapper[4713]: W0308 00:34:48.708604 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc460969_e1ae_4bac_8893_7677ac74787b.slice/crio-9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac WatchSource:0}: Error finding container 9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac: Status 404 returned error can't find the container with id 9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.209486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"49fd9efa0d17e1b0a31476983cd621a2c4da29c35a74e8a29e32b9d478b98ff7"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.212341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.212384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"fe2077d4048ef8a7f48155c90a44e800aaf420d311535ee1a6f2b4538a01da6e"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.214845 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.214879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"4ea733096fe695d66bbcbe57f75287225c05a01b72fa0f3bd5f0165fa4a545ef"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.221593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.224754 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.224779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac"} Mar 08 00:34:50 crc kubenswrapper[4713]: I0308 00:34:50.597562 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:50 crc kubenswrapper[4713]: I0308 00:34:50.665718 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" 
Mar 08 00:34:51 crc kubenswrapper[4713]: I0308 00:34:51.310753 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.255224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"8a2a5280aad7b8979a719e3c7610932d4cd0af00930dbb8a758731dd94d5aa76"} Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.852653 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.853893 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.856008 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.869161 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.971727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972215 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972275 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074036 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074079 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: 
\"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.075113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.075125 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.080162 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.097116 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.176471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.541391 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:53 crc kubenswrapper[4713]: E0308 00:34:53.541859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.591070 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:55 crc kubenswrapper[4713]: I0308 00:34:55.302755 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"80f4b4014c4dbe1fb437f8e190a30553b02ac77aad1b5eb6b6bee29d7230bb50"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.309880 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.311680 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"12f24fd5ea75d7fac61d291610b51da255f458f9839805c4c526a99564e4c9c0"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.314910 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"7f14074c625eb16dc1fbd098d47e54dd2c4d6db611ea7361baf8ed8a511504bb"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.316758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"960529310bef2e081705c184e3296095340d6a20de195788598008091f39a7be"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.318704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"896dda969c52affe6df70b3c89ab1673b020960693dd9ea87bff10a44743cc9c"} 
Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.320151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"36aa5f3c129ba3c2478b4da8d8b4c638f47ce70147a4454414cb0cd35e050711"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.338115 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" podStartSLOduration=3.464379417 podStartE2EDuration="26.338087858s" podCreationTimestamp="2026-03-08 00:34:30 +0000 UTC" firstStartedPulling="2026-03-08 00:34:33.104934623 +0000 UTC m=+1727.224566866" lastFinishedPulling="2026-03-08 00:34:55.978643074 +0000 UTC m=+1750.098275307" observedRunningTime="2026-03-08 00:34:56.333107277 +0000 UTC m=+1750.452739520" watchObservedRunningTime="2026-03-08 00:34:56.338087858 +0000 UTC m=+1750.457720111" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.363645 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=17.472960069 podStartE2EDuration="40.363621031s" podCreationTimestamp="2026-03-08 00:34:16 +0000 UTC" firstStartedPulling="2026-03-08 00:34:33.006505538 +0000 UTC m=+1727.126137771" lastFinishedPulling="2026-03-08 00:34:55.8971665 +0000 UTC m=+1750.016798733" observedRunningTime="2026-03-08 00:34:56.35712873 +0000 UTC m=+1750.476760993" watchObservedRunningTime="2026-03-08 00:34:56.363621031 +0000 UTC m=+1750.483253264" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.383518 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" podStartSLOduration=3.246997386 podStartE2EDuration="10.383492065s" podCreationTimestamp="2026-03-08 00:34:46 +0000 UTC" 
firstStartedPulling="2026-03-08 00:34:48.735160513 +0000 UTC m=+1742.854792746" lastFinishedPulling="2026-03-08 00:34:55.871655192 +0000 UTC m=+1749.991287425" observedRunningTime="2026-03-08 00:34:56.379979312 +0000 UTC m=+1750.499611555" watchObservedRunningTime="2026-03-08 00:34:56.383492065 +0000 UTC m=+1750.503124298" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.403280 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" podStartSLOduration=8.405990508 podStartE2EDuration="23.403261876s" podCreationTimestamp="2026-03-08 00:34:33 +0000 UTC" firstStartedPulling="2026-03-08 00:34:40.902287576 +0000 UTC m=+1735.021919809" lastFinishedPulling="2026-03-08 00:34:55.899558944 +0000 UTC m=+1750.019191177" observedRunningTime="2026-03-08 00:34:56.399309482 +0000 UTC m=+1750.518941735" watchObservedRunningTime="2026-03-08 00:34:56.403261876 +0000 UTC m=+1750.522894099" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.423248 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" podStartSLOduration=2.802087534 podStartE2EDuration="17.423222312s" podCreationTimestamp="2026-03-08 00:34:39 +0000 UTC" firstStartedPulling="2026-03-08 00:34:41.386598089 +0000 UTC m=+1735.506230332" lastFinishedPulling="2026-03-08 00:34:56.007732877 +0000 UTC m=+1750.127365110" observedRunningTime="2026-03-08 00:34:56.415214501 +0000 UTC m=+1750.534846724" watchObservedRunningTime="2026-03-08 00:34:56.423222312 +0000 UTC m=+1750.542854575" Mar 08 00:34:57 crc kubenswrapper[4713]: I0308 00:34:57.329190 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"b30b3e644e5c27e518037db47b829436643bf0d2f743aeb7246cb9b7080e84f9"} 
Mar 08 00:34:57 crc kubenswrapper[4713]: I0308 00:34:57.352983 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" podStartSLOduration=4.819909506 podStartE2EDuration="5.352967192s" podCreationTimestamp="2026-03-08 00:34:52 +0000 UTC" firstStartedPulling="2026-03-08 00:34:55.797343059 +0000 UTC m=+1749.916975292" lastFinishedPulling="2026-03-08 00:34:56.330400745 +0000 UTC m=+1750.450032978" observedRunningTime="2026-03-08 00:34:57.34794109 +0000 UTC m=+1751.467573333" watchObservedRunningTime="2026-03-08 00:34:57.352967192 +0000 UTC m=+1751.472599425" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.089853 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.090388 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" containerID="cri-o://2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" gracePeriod=30 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.357281 4713 generic.go:334] "Generic (PLEG): container finished" podID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerID="2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" exitCode=0 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.357577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerDied","Data":"2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3"} Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.367633 4713 generic.go:334] "Generic (PLEG): container finished" podID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" 
containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" exitCode=0 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.367675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerDied","Data":"6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159"} Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.368164 4713 scope.go:117] "RemoveContainer" containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.521110 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604110 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604159 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604210 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc 
kubenswrapper[4713]: I0308 00:35:00.604251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604360 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.606294 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "sasl-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.613578 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.614341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.616685 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.618007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb" (OuterVolumeSpecName: "kube-api-access-rkmlb") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "kube-api-access-rkmlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.624146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.626154 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705874 4713 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705903 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705913 4713 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705922 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705933 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705943 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705953 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkmlb\" (UniqueName: 
\"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.154862 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:01 crc kubenswrapper[4713]: E0308 00:35:01.156103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.156203 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.156400 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.157039 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.161748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212787 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212852 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212929 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod 
\"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.213052 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.213131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314596 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.314655 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314716 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314738 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314759 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314786 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.316214 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.319535 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.320522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.320662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.320602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.339046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.344669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.377362 4713 generic.go:334] "Generic (PLEG): container finished" podID="fff80c8a-de9a-483b-8be3-5ce1423649cb" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54" exitCode=0 Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.377663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerDied","Data":"58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.378478 4713 scope.go:117] "RemoveContainer" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.381400 4713 generic.go:334] "Generic (PLEG): container finished" podID="367439a6-a382-49f1-b0af-cf399b5a6401" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.381476 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerDied","Data":"70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.382994 4713 scope.go:117] "RemoveContainer" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.388630 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391446 4713 generic.go:334] "Generic (PLEG): container finished" podID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391516 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerDied","Data":"b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391920 4713 scope.go:117] "RemoveContainer" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.394929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerDied","Data":"71917d86375943e31a9292ae7412991594bcc498f11ed7d30ee0bdc265d89c06"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.395051 4713 scope.go:117] "RemoveContainer" containerID="2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.394957 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403024 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc460969-e1ae-4bac-8893-7677ac74787b" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403077 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerDied","Data":"821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403596 4713 scope.go:117] "RemoveContainer" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.472569 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.552426 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.561262 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.022513 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:02 crc kubenswrapper[4713]: W0308 00:35:02.025406 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45b0eb2_8f38_42e0_8c0a_98a6f453263a.slice/crio-7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81 WatchSource:0}: Error finding container 
7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81: Status 404 returned error can't find the container with id 7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81 Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.413507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.415603 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" event={"ID":"a45b0eb2-8f38-42e0-8c0a-98a6f453263a","Type":"ContainerStarted","Data":"aa0612733b05d7027a684d1a0b180dae0fe589898783b419564190e9bedaa400"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.415723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" event={"ID":"a45b0eb2-8f38-42e0-8c0a-98a6f453263a","Type":"ContainerStarted","Data":"7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.418730 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422010 4713 generic.go:334] "Generic (PLEG): container finished" podID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd" exitCode=0 Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerDied","Data":"578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422290 4713 scope.go:117] "RemoveContainer" containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422510 4713 scope.go:117] "RemoveContainer" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd" Mar 08 00:35:02 crc kubenswrapper[4713]: E0308 00:35:02.422685 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_service-telemetry(a441502e-5d0a-4ec6-ac3c-df20f292efc8)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" podUID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.424789 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.442425 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.559232 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" 
path="/var/lib/kubelet/pods/52ed2487-d016-4930-a9ec-98500bfc0db3/volumes" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.573020 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" podStartSLOduration=2.57300065 podStartE2EDuration="2.57300065s" podCreationTimestamp="2026-03-08 00:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:35:02.535351387 +0000 UTC m=+1756.654983620" watchObservedRunningTime="2026-03-08 00:35:02.57300065 +0000 UTC m=+1756.692632883" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451597 4713 generic.go:334] "Generic (PLEG): container finished" podID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a" exitCode=0 Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerDied","Data":"f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a"} Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451922 4713 scope.go:117] "RemoveContainer" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.452484 4713 scope.go:117] "RemoveContainer" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a" Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.452838 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_service-telemetry(7aaf11cd-f1cf-42c7-9fe9-52880e0af19c)\"" 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" podUID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454440 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc460969-e1ae-4bac-8893-7677ac74787b" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363" exitCode=0
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerDied","Data":"4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"}
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454748 4713 scope.go:117] "RemoveContainer" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"
Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.454930 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_service-telemetry(dc460969-e1ae-4bac-8893-7677ac74787b)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" podUID="dc460969-e1ae-4bac-8893-7677ac74787b"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.457302 4713 generic.go:334] "Generic (PLEG): container finished" podID="fff80c8a-de9a-483b-8be3-5ce1423649cb" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1" exitCode=0
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.457369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerDied","Data":"e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"}
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.458245 4713 scope.go:117] "RemoveContainer" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"
Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.459159 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_service-telemetry(fff80c8a-de9a-483b-8be3-5ce1423649cb)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" podUID="fff80c8a-de9a-483b-8be3-5ce1423649cb"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.459696 4713 generic.go:334] "Generic (PLEG): container finished" podID="367439a6-a382-49f1-b0af-cf399b5a6401" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba" exitCode=0
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.459725 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerDied","Data":"bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"}
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.460230 4713 scope.go:117] "RemoveContainer" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"
Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.460478 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_service-telemetry(367439a6-a382-49f1-b0af-cf399b5a6401)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" podUID="367439a6-a382-49f1-b0af-cf399b5a6401"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.494584 4713 scope.go:117] "RemoveContainer" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.536945 4713 scope.go:117] "RemoveContainer" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54"
Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.591104 4713 scope.go:117] "RemoveContainer" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707"
Mar 08 00:35:04 crc kubenswrapper[4713]: I0308 00:35:04.541997 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:35:04 crc kubenswrapper[4713]: E0308 00:35:04.542230 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.145855 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"]
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.147384 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.149273 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.149663 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.157249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235591 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235713 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336670 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336767 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.337509 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.349539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.353031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.501430 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.930609 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 08 00:35:09 crc kubenswrapper[4713]: W0308 00:35:09.933134 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a97222f_e496_4378_bc3d_6a508f559df7.slice/crio-377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad WatchSource:0}: Error finding container 377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad: Status 404 returned error can't find the container with id 377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad
Mar 08 00:35:10 crc kubenswrapper[4713]: I0308 00:35:10.538808 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"1a97222f-e496-4378-bc3d-6a508f559df7","Type":"ContainerStarted","Data":"377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad"}
Mar 08 00:35:14 crc kubenswrapper[4713]: I0308 00:35:14.541549 4713 scope.go:117] "RemoveContainer" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"
Mar 08 00:35:14 crc kubenswrapper[4713]: I0308 00:35:14.542212 4713 scope.go:117] "RemoveContainer" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a"
Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.540650 4713 scope.go:117] "RemoveContainer" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"
Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.541873 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:35:15 crc kubenswrapper[4713]: E0308 00:35:15.542124 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.543079 4713 scope.go:117] "RemoveContainer" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.540635 4713 scope.go:117] "RemoveContainer" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.601656 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"3f26b3b7d0aaecb8ab2d98362882634e444eeec6faa861014c5228723d9f98df"}
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.603963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"7ca4705ce545532bdbd07579c75173fdf9ca0a5b0116e20b9720c2874be3cbe3"}
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.606292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"8f1f71ea5ff75a5db89e966feb3b61b330a9abdfe33b187f6554a8fad390f789"}
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.607738 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"1a97222f-e496-4378-bc3d-6a508f559df7","Type":"ContainerStarted","Data":"5b96f2c1fffdca7302dfd1a0a9361137e4e1bec2600143e51fa5ed758a317b03"}
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.609670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"56cfac6e554d14a6ac665137ad59c976d1246f7973cb11f290ac3c5b0d556219"}
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.678279 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.556626901 podStartE2EDuration="8.678256207s" podCreationTimestamp="2026-03-08 00:35:09 +0000 UTC" firstStartedPulling="2026-03-08 00:35:09.934691815 +0000 UTC m=+1764.054324048" lastFinishedPulling="2026-03-08 00:35:17.056321121 +0000 UTC m=+1771.175953354" observedRunningTime="2026-03-08 00:35:17.666079505 +0000 UTC m=+1771.785711738" watchObservedRunningTime="2026-03-08 00:35:17.678256207 +0000 UTC m=+1771.797888440"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.916783 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-f4r52"]
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.918277 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.920367 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.920703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.921151 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.921878 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.922731 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.922754 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.931432 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-f4r52"]
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975070 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975179 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975207 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975235 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975251 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075880 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075925 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075976 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.077699 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.077972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.078041 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.078851 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.079332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.079713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.099122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.236552 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.258592 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"]
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.259639 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.279038 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.287170 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.385864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.421615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.491911 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-f4r52"]
Mar 08 00:35:18 crc kubenswrapper[4713]: W0308 00:35:18.500522 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cbd55a3_d3b0_4c65_8b16_a7a9e2a8c033.slice/crio-2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d WatchSource:0}: Error finding container 2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d: Status 404 returned error can't find the container with id 2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.618062 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"ba441f81d307aa312c0fb8a5c4a6e74842c602a014943d0f4bb593e72e761a72"}
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.619202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d"}
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.625135 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.845832 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 08 00:35:18 crc kubenswrapper[4713]: W0308 00:35:18.847437 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf866ca9_19cc_4b26_96ae_370b911a5776.slice/crio-7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc WatchSource:0}: Error finding container 7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc: Status 404 returned error can't find the container with id 7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc
Mar 08 00:35:19 crc kubenswrapper[4713]: I0308 00:35:19.628935 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerStarted","Data":"7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc"}
Mar 08 00:35:22 crc kubenswrapper[4713]: I0308 00:35:22.761719 4713 scope.go:117] "RemoveContainer" containerID="01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451"
Mar 08 00:35:27 crc kubenswrapper[4713]: I0308 00:35:27.541098 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:35:27 crc kubenswrapper[4713]: E0308 00:35:27.541712 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:35:31 crc kubenswrapper[4713]: I0308 00:35:31.458704 4713 scope.go:117] "RemoveContainer" containerID="0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef"
Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.733936 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerID="63fa6058ccb9e9e70ae07366fd458652f6fd38278fd206e536070f9b9ac066d9" exitCode=0
Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.733992 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerDied","Data":"63fa6058ccb9e9e70ae07366fd458652f6fd38278fd206e536070f9b9ac066d9"}
Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.735734 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8"}
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.270087 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.407335 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_bf866ca9-19cc-4b26-96ae-370b911a5776/curl/0.log"
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.407771 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"bf866ca9-19cc-4b26-96ae-370b911a5776\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") "
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.414191 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98" (OuterVolumeSpecName: "kube-api-access-gxl98") pod "bf866ca9-19cc-4b26-96ae-370b911a5776" (UID: "bf866ca9-19cc-4b26-96ae-370b911a5776"). InnerVolumeSpecName "kube-api-access-gxl98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.510421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") on node \"crc\" DevicePath \"\""
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.644107 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log"
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.771893 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6"}
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773332 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerDied","Data":"7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc"}
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773862 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc"
Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.789763 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-f4r52" podStartSLOduration=1.675689631 podStartE2EDuration="21.789747418s" podCreationTimestamp="2026-03-08 00:35:17 +0000 UTC" firstStartedPulling="2026-03-08 00:35:18.502928596 +0000 UTC m=+1772.622560829" lastFinishedPulling="2026-03-08 00:35:38.616986383 +0000 UTC m=+1792.736618616" observedRunningTime="2026-03-08 00:35:38.788171736 +0000 UTC m=+1792.907803969" watchObservedRunningTime="2026-03-08 00:35:38.789747418 +0000 UTC m=+1792.909379641"
Mar 08 00:35:39 crc kubenswrapper[4713]: I0308 00:35:39.542352 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:35:39 crc kubenswrapper[4713]: E0308 00:35:39.542962 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:35:52 crc kubenswrapper[4713]: I0308 00:35:52.541302 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:35:52 crc kubenswrapper[4713]: E0308 00:35:52.542007 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.135681 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"]
Mar 08 00:36:00 crc kubenswrapper[4713]: E0308 00:36:00.136524 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.136536 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.136668 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.137125 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139576 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139588 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139886 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.145640 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"]
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.218129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " pod="openshift-infra/auto-csr-approver-29548836-wg7kn"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.319302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " pod="openshift-infra/auto-csr-approver-29548836-wg7kn"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.337753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " pod="openshift-infra/auto-csr-approver-29548836-wg7kn"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.462939 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn"
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.693960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"]
Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.936882 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerStarted","Data":"d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271"}
Mar 08 00:36:02 crc kubenswrapper[4713]: I0308 00:36:02.952040 4713 generic.go:334] "Generic (PLEG): container finished" podID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerID="d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6" exitCode=0
Mar 08 00:36:02 crc kubenswrapper[4713]: I0308 00:36:02.952144 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerDied","Data":"d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6"}
Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.237723 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.290295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.299721 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6" (OuterVolumeSpecName: "kube-api-access-5wrp6") pod "90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" (UID: "90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c"). InnerVolumeSpecName "kube-api-access-5wrp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.392253 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971262 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerDied","Data":"d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271"} Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971309 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971770 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:05 crc kubenswrapper[4713]: I0308 00:36:05.302888 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:36:05 crc kubenswrapper[4713]: I0308 00:36:05.308556 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.546845 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:06 crc kubenswrapper[4713]: E0308 00:36:06.547314 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.555026 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" path="/var/lib/kubelet/pods/2b849b06-281c-44be-a061-ca5b3905b3e1/volumes" Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985037 4713 generic.go:334] "Generic (PLEG): container finished" podID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerID="aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8" exitCode=1 Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8"} Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985703 4713 
scope.go:117] "RemoveContainer" containerID="aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8" Mar 08 00:36:08 crc kubenswrapper[4713]: I0308 00:36:08.759196 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log" Mar 08 00:36:11 crc kubenswrapper[4713]: I0308 00:36:11.017142 4713 generic.go:334] "Generic (PLEG): container finished" podID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerID="817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6" exitCode=0 Mar 08 00:36:11 crc kubenswrapper[4713]: I0308 00:36:11.017191 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6"} Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.361243 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408225 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408353 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408425 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408505 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408559 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.417145 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv" (OuterVolumeSpecName: "kube-api-access-mqhnv") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "kube-api-access-mqhnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.426947 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.428206 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.431810 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.433085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.444723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.445771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510549 4713 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510589 4713 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510600 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510611 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510620 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510628 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510636 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.038846 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d"} Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.039184 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d" Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.038897 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:36:17 crc kubenswrapper[4713]: I0308 00:36:17.540670 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:17 crc kubenswrapper[4713]: E0308 00:36:17.542120 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.033367 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:20 crc kubenswrapper[4713]: E0308 00:36:20.035137 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035301 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc 
kubenswrapper[4713]: E0308 00:36:20.035450 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035519 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: E0308 00:36:20.035594 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035658 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035891 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035980 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.036057 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.036993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.039848 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.040890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041046 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041164 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041924 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.042287 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.049387 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122734 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122815 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122927 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: 
I0308 00:36:20.123035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224407 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " 
pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224602 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: 
\"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.226002 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.226074 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.266992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: 
\"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.365996 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.867223 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:21 crc kubenswrapper[4713]: I0308 00:36:21.105765 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e"} Mar 08 00:36:21 crc kubenswrapper[4713]: I0308 00:36:21.106209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec"} Mar 08 00:36:22 crc kubenswrapper[4713]: I0308 00:36:22.114357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d"} Mar 08 00:36:22 crc kubenswrapper[4713]: I0308 00:36:22.134973 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-mljxj" podStartSLOduration=2.134954552 podStartE2EDuration="2.134954552s" podCreationTimestamp="2026-03-08 00:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:36:22.133123683 +0000 UTC m=+1836.252755916" watchObservedRunningTime="2026-03-08 00:36:22.134954552 +0000 UTC m=+1836.254586805" Mar 
08 00:36:31 crc kubenswrapper[4713]: I0308 00:36:31.541606 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:31 crc kubenswrapper[4713]: E0308 00:36:31.542408 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:33 crc kubenswrapper[4713]: I0308 00:36:33.110226 4713 scope.go:117] "RemoveContainer" containerID="5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879" Mar 08 00:36:46 crc kubenswrapper[4713]: I0308 00:36:46.546926 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:46 crc kubenswrapper[4713]: E0308 00:36:46.548134 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.364383 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerID="c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d" exitCode=0 Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.364483 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" 
event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d"} Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.366052 4713 scope.go:117] "RemoveContainer" containerID="c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d" Mar 08 00:36:55 crc kubenswrapper[4713]: I0308 00:36:55.383479 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerID="08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e" exitCode=0 Mar 08 00:36:55 crc kubenswrapper[4713]: I0308 00:36:55.383579 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e"} Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.640739 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771235 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771333 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771446 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771482 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771537 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.784926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4" (OuterVolumeSpecName: "kube-api-access-bl4j4") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "kube-api-access-bl4j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.791993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.793315 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.797459 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.799017 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.806240 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.808701 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873612 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873657 4713 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873676 4713 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873688 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873702 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873714 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873729 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec"} Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400494 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400536 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.541009 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:57 crc kubenswrapper[4713]: E0308 00:36:57.541306 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:58 crc kubenswrapper[4713]: I0308 00:36:58.519403 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-f4r52_7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033/smoketest-collectd/0.log" Mar 08 00:36:58 crc kubenswrapper[4713]: I0308 00:36:58.791062 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-f4r52_7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033/smoketest-ceilometer/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.023662 4713 log.go:25] "Finished parsing 
log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-qpwg6_a45b0eb2-8f38-42e0-8c0a-98a6f453263a/default-interconnect/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.333491 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_7aaf11cd-f1cf-42c7-9fe9-52880e0af19c/bridge/2.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.568537 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_7aaf11cd-f1cf-42c7-9fe9-52880e0af19c/sg-core/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.872823 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_dc460969-e1ae-4bac-8893-7677ac74787b/bridge/2.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.113454 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_dc460969-e1ae-4bac-8893-7677ac74787b/sg-core/0.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.371610 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_367439a6-a382-49f1-b0af-cf399b5a6401/bridge/2.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.602674 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_367439a6-a382-49f1-b0af-cf399b5a6401/sg-core/0.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.829977 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_a441502e-5d0a-4ec6-ac3c-df20f292efc8/bridge/2.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.093508 4713 log.go:25] 
"Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_a441502e-5d0a-4ec6-ac3c-df20f292efc8/sg-core/0.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.337282 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_fff80c8a-de9a-483b-8be3-5ce1423649cb/bridge/2.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.601433 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_fff80c8a-de9a-483b-8be3-5ce1423649cb/sg-core/0.log" Mar 08 00:37:04 crc kubenswrapper[4713]: I0308 00:37:04.626116 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-795859486c-d7k9q_934a7934-e52f-4279-9c2a-4255daf78d5a/operator/0.log" Mar 08 00:37:04 crc kubenswrapper[4713]: I0308 00:37:04.846310 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_cf91b8a6-24ec-4c39-8337-f05acf19e199/prometheus/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.085476 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_c8a16625-a3a9-4404-bf4a-073fc8f621b9/elasticsearch/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.317299 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.572745 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_76d6e5d8-8303-43ac-a477-0dfe579adad2/alertmanager/0.log" Mar 08 00:37:10 crc kubenswrapper[4713]: I0308 00:37:10.541770 4713 scope.go:117] "RemoveContainer" 
containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:10 crc kubenswrapper[4713]: E0308 00:37:10.542502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:19 crc kubenswrapper[4713]: I0308 00:37:19.858873 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-6f9dc9fb4b-dzbm4_c714eef0-0fe5-4836-80e1-c640aa9527e7/operator/0.log" Mar 08 00:37:22 crc kubenswrapper[4713]: I0308 00:37:22.941695 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-795859486c-d7k9q_934a7934-e52f-4279-9c2a-4255daf78d5a/operator/0.log" Mar 08 00:37:23 crc kubenswrapper[4713]: I0308 00:37:23.204018 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_1a97222f-e496-4378-bc3d-6a508f559df7/qdr/0.log" Mar 08 00:37:24 crc kubenswrapper[4713]: I0308 00:37:24.542644 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:24 crc kubenswrapper[4713]: E0308 00:37:24.542956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.182540 4713 
scope.go:117] "RemoveContainer" containerID="fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.209975 4713 scope.go:117] "RemoveContainer" containerID="8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.256137 4713 scope.go:117] "RemoveContainer" containerID="c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.287494 4713 scope.go:117] "RemoveContainer" containerID="a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20" Mar 08 00:37:38 crc kubenswrapper[4713]: I0308 00:37:38.542156 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:38 crc kubenswrapper[4713]: E0308 00:37:38.543096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.415697 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:46 crc kubenswrapper[4713]: E0308 00:37:46.416512 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416526 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: E0308 00:37:46.416565 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416574 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416691 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416707 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.417289 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.431793 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.516186 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.618012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.640565 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.733146 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.956588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.784097 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerStarted","Data":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.784403 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerStarted","Data":"dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694"} Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.803406 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-wbrks" podStartSLOduration=1.696018105 podStartE2EDuration="1.803391576s" podCreationTimestamp="2026-03-08 00:37:46 +0000 UTC" firstStartedPulling="2026-03-08 00:37:46.958076502 +0000 UTC m=+1921.077708735" lastFinishedPulling="2026-03-08 00:37:47.065449963 +0000 UTC m=+1921.185082206" observedRunningTime="2026-03-08 00:37:47.802995836 +0000 UTC m=+1921.922628069" watchObservedRunningTime="2026-03-08 00:37:47.803391576 +0000 UTC m=+1921.923023809" Mar 08 00:37:51 crc 
kubenswrapper[4713]: I0308 00:37:51.541393 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:51 crc kubenswrapper[4713]: E0308 00:37:51.541867 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.733676 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.734266 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.765959 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.831182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.832222 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.839758 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cz8lx"/"openshift-service-ca.crt" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.840941 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cz8lx"/"kube-root-ca.crt" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.856906 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.910223 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.975909 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.976129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.039939 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.095044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.151138 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.410698 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.856104 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"979e6681fbe6dac3ab631be1c477b1d0c3781d5561bf4ce433f7f61ccc039d85"} Mar 08 00:37:58 crc kubenswrapper[4713]: I0308 00:37:58.865797 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-wbrks" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" containerID="cri-o://f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" gracePeriod=2 Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.277526 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.339092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.345273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr" (OuterVolumeSpecName: "kube-api-access-v6vtr") pod "2c1b190e-aa10-4da9-a6a5-2f15cb53e693" (UID: "2c1b190e-aa10-4da9-a6a5-2f15cb53e693"). InnerVolumeSpecName "kube-api-access-v6vtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.440614 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") on node \"crc\" DevicePath \"\"" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874192 4713 generic.go:334] "Generic (PLEG): container finished" podID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" exitCode=0 Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874233 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerDied","Data":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874260 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerDied","Data":"dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694"} Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874277 4713 scope.go:117] "RemoveContainer" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874370 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.910113 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.918461 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:38:00 crc kubenswrapper[4713]: E0308 00:38:00.022634 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1b190e_aa10_4da9_a6a5_2f15cb53e693.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1b190e_aa10_4da9_a6a5_2f15cb53e693.slice/crio-dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694\": RecentStats: unable to find data in memory cache]" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146109 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:00 crc kubenswrapper[4713]: E0308 00:38:00.146459 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146482 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146630 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.147194 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149656 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149677 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149755 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.160074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.251597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.353111 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.371320 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " 
pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.466869 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.549446 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" path="/var/lib/kubelet/pods/2c1b190e-aa10-4da9-a6a5-2f15cb53e693/volumes" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.399726 4713 scope.go:117] "RemoveContainer" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:38:04 crc kubenswrapper[4713]: E0308 00:38:04.400387 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": container with ID starting with f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328 not found: ID does not exist" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.400419 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} err="failed to get container status \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": rpc error: code = NotFound desc = could not find container \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": container with ID starting with f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328 not found: ID does not exist" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.866303 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:04 crc kubenswrapper[4713]: W0308 00:38:04.869358 4713 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca9577aa_e929_4bd9_8056_a85221917ebc.slice/crio-999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999 WatchSource:0}: Error finding container 999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999: Status 404 returned error can't find the container with id 999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999 Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.872328 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.920402 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerStarted","Data":"999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.921835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.921868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.937239 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cz8lx/must-gather-6ljft" podStartSLOduration=1.881564291 podStartE2EDuration="8.93722374s" podCreationTimestamp="2026-03-08 00:37:56 +0000 UTC" firstStartedPulling="2026-03-08 00:37:57.42628097 +0000 UTC m=+1931.545913203" 
lastFinishedPulling="2026-03-08 00:38:04.481940419 +0000 UTC m=+1938.601572652" observedRunningTime="2026-03-08 00:38:04.934677042 +0000 UTC m=+1939.054309285" watchObservedRunningTime="2026-03-08 00:38:04.93722374 +0000 UTC m=+1939.056855973" Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.546729 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:06 crc kubenswrapper[4713]: E0308 00:38:06.547413 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.942589 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerID="73f07223497890b1ddebc93a4d6e16e91e4882539ad2021b501bdc3d3d15f480" exitCode=0 Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.942742 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerDied","Data":"73f07223497890b1ddebc93a4d6e16e91e4882539ad2021b501bdc3d3d15f480"} Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.219881 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.379151 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"ca9577aa-e929-4bd9-8056-a85221917ebc\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.385745 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh" (OuterVolumeSpecName: "kube-api-access-d57vh") pod "ca9577aa-e929-4bd9-8056-a85221917ebc" (UID: "ca9577aa-e929-4bd9-8056-a85221917ebc"). InnerVolumeSpecName "kube-api-access-d57vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.481146 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") on node \"crc\" DevicePath \"\"" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.963952 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerDied","Data":"999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999"} Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.964259 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.964308 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:09 crc kubenswrapper[4713]: I0308 00:38:09.273434 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:38:09 crc kubenswrapper[4713]: I0308 00:38:09.278465 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:38:10 crc kubenswrapper[4713]: I0308 00:38:10.549242 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" path="/var/lib/kubelet/pods/d0a13b2b-064d-4323-8d5c-d86f76405f38/volumes" Mar 08 00:38:17 crc kubenswrapper[4713]: I0308 00:38:17.541211 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:17 crc kubenswrapper[4713]: E0308 00:38:17.542001 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:30 crc kubenswrapper[4713]: I0308 00:38:30.541122 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:30 crc kubenswrapper[4713]: E0308 00:38:30.542106 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:33 crc kubenswrapper[4713]: I0308 00:38:33.358972 4713 scope.go:117] "RemoveContainer" containerID="d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.197406 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7wd77_f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc/control-plane-machine-set-operator/0.log" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.360043 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dkkh7_c6893b56-2395-4f91-9349-c23b48b957c8/machine-api-operator/0.log" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.410539 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dkkh7_c6893b56-2395-4f91-9349-c23b48b957c8/kube-rbac-proxy/0.log" Mar 08 00:38:44 crc kubenswrapper[4713]: I0308 00:38:44.541880 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:45 crc kubenswrapper[4713]: I0308 00:38:45.220747 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"} Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.142317 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-gkqzr_d4f51ae9-d2ab-4704-aeeb-5710aceda4f0/cert-manager-controller/0.log" Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.269113 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-9mcfp_1a191145-c818-4e84-8bf3-91145fe9db03/cert-manager-cainjector/0.log" Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.338448 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-qmcpl_2a071bf2-22e7-40f7-976a-74f79abbbd78/cert-manager-webhook/0.log" Mar 08 00:39:06 crc kubenswrapper[4713]: I0308 00:39:06.963413 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-4z5hw_1f48c701-2464-42f6-b2d7-c851ae965f1b/prometheus-operator/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.107151 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_860dc604-80d3-4d4b-8b1e-8a430b706882/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.120414 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_e2152c14-6da7-4f74-a30e-da9e4e7c1acc/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.295520 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v4h4x_f559f6d0-89dc-4d38-807f-491671408dc7/operator/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.359542 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tw72p_3d1a0596-7485-4376-9630-688753a7abd7/perses-operator/0.log" Mar 08 00:39:20 crc kubenswrapper[4713]: I0308 00:39:20.939163 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 
00:39:21.105630 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.107082 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.147202 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.304582 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.313042 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.333192 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/extract/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.514332 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.610445 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.649634 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.668405 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.800608 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/extract/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.802219 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.828350 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.960944 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.086129 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 
00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.132018 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.133130 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.307519 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/extract/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.333273 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.337871 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.493332 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.658963 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.664372 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.664465 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.828802 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.853249 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.860517 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/extract/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.979017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.138891 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.181646 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.181688 4713 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.301626 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.317372 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.554392 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.654621 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/registry-server/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.708218 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.789253 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.789264 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.943432 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.963070 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.147532 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bm59_26e0cfc6-458c-4be3-b57c-1cd5fad657c4/marketplace-operator/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.261413 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.405917 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/registry-server/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.455185 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.485778 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.491017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.612857 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.686439 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.935993 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/registry-server/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.299187 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_860dc604-80d3-4d4b-8b1e-8a430b706882/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.314073 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-4z5hw_1f48c701-2464-42f6-b2d7-c851ae965f1b/prometheus-operator/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.408489 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_e2152c14-6da7-4f74-a30e-da9e4e7c1acc/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.474480 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v4h4x_f559f6d0-89dc-4d38-807f-491671408dc7/operator/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.505524 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tw72p_3d1a0596-7485-4376-9630-688753a7abd7/perses-operator/0.log" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.147722 4713 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"] Mar 08 00:40:00 crc kubenswrapper[4713]: E0308 00:40:00.148500 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.148513 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.148650 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.149101 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.151812 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.151907 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.152272 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.162763 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"] Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.210611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:00 crc 
kubenswrapper[4713]: I0308 00:40:00.312562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.330241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.470855 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.688171 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"] Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.762936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerStarted","Data":"d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5"} Mar 08 00:40:02 crc kubenswrapper[4713]: I0308 00:40:02.775992 4713 generic.go:334] "Generic (PLEG): container finished" podID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerID="1b4785721e3b1cd2a3224d4a2879be9724cfc7ed1cc394cbcb9be86d2951adad" exitCode=0 Mar 08 00:40:02 crc kubenswrapper[4713]: I0308 00:40:02.776053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" 
event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerDied","Data":"1b4785721e3b1cd2a3224d4a2879be9724cfc7ed1cc394cbcb9be86d2951adad"} Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.050948 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.170922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.175760 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth" (OuterVolumeSpecName: "kube-api-access-bpfth") pod "c19c555a-8190-4c25-97c4-3d6b74b4fd7f" (UID: "c19c555a-8190-4c25-97c4-3d6b74b4fd7f"). InnerVolumeSpecName "kube-api-access-bpfth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.272706 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerDied","Data":"d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5"} Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796839 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5" Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796869 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" Mar 08 00:40:05 crc kubenswrapper[4713]: I0308 00:40:05.117429 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:40:05 crc kubenswrapper[4713]: I0308 00:40:05.123284 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:40:06 crc kubenswrapper[4713]: I0308 00:40:06.551098 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" path="/var/lib/kubelet/pods/ef90820d-fdcc-4ff1-97db-756e8c96851a/volumes" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.568734 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:16 crc kubenswrapper[4713]: E0308 00:40:16.569441 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.569455 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.569612 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.570642 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.570735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.729449 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.730586 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.730657 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " 
pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831023 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.832042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.832325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc 
kubenswrapper[4713]: I0308 00:40:16.848669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.886422 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.385767 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.916977 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26" exitCode=0 Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.917030 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"} Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.917060 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerStarted","Data":"7c2423739e13a478b468599d3a2d1d9b6ec7b44cb5c5a51aec50a5c426275fb3"} Mar 08 00:40:19 crc kubenswrapper[4713]: I0308 00:40:19.935376 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb" exitCode=0 Mar 08 00:40:19 crc kubenswrapper[4713]: I0308 00:40:19.935486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"} Mar 08 00:40:20 crc kubenswrapper[4713]: I0308 00:40:20.947811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerStarted","Data":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"} Mar 08 00:40:20 crc kubenswrapper[4713]: I0308 00:40:20.984577 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bcvn4" podStartSLOduration=2.425717648 podStartE2EDuration="4.984539948s" podCreationTimestamp="2026-03-08 00:40:16 +0000 UTC" firstStartedPulling="2026-03-08 00:40:17.918851328 +0000 UTC m=+2072.038483561" lastFinishedPulling="2026-03-08 00:40:20.477673598 +0000 UTC m=+2074.597305861" observedRunningTime="2026-03-08 00:40:20.973501317 +0000 UTC m=+2075.093133580" watchObservedRunningTime="2026-03-08 00:40:20.984539948 +0000 UTC m=+2075.104172191" Mar 08 00:40:26 crc kubenswrapper[4713]: I0308 00:40:26.887086 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:26 crc kubenswrapper[4713]: I0308 00:40:26.887590 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:27 crc kubenswrapper[4713]: I0308 00:40:27.943447 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bcvn4" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" probeResult="failure" output=< Mar 08 00:40:27 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:40:27 crc kubenswrapper[4713]: > Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 
00:40:32.082476 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" exitCode=0 Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.082594 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerDied","Data":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"} Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.084519 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.557620 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/gather/0.log" Mar 08 00:40:33 crc kubenswrapper[4713]: I0308 00:40:33.448654 4713 scope.go:117] "RemoveContainer" containerID="54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3" Mar 08 00:40:36 crc kubenswrapper[4713]: I0308 00:40:36.948051 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:36 crc kubenswrapper[4713]: I0308 00:40:36.997103 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:37 crc kubenswrapper[4713]: I0308 00:40:37.177147 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.131639 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bcvn4" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" 
containerID="cri-o://7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" gracePeriod=2 Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.480262 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.512055 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.519860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2" (OuterVolumeSpecName: "kube-api-access-ld2m2") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "kube-api-access-ld2m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613062 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613535 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.614411 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities" (OuterVolumeSpecName: "utilities") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.716022 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.739542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.817354 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141091 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" exitCode=0 Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141155 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"} Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141703 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"7c2423739e13a478b468599d3a2d1d9b6ec7b44cb5c5a51aec50a5c426275fb3"} Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141209 4713 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141795 4713 scope.go:117] "RemoveContainer" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.169847 4713 scope.go:117] "RemoveContainer" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.173359 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.182336 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"] Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.201621 4713 scope.go:117] "RemoveContainer" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.237659 4713 scope.go:117] "RemoveContainer" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.238373 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": container with ID starting with 7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52 not found: ID does not exist" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.238423 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"} err="failed to get container status \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": rpc error: 
code = NotFound desc = could not find container \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": container with ID starting with 7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52 not found: ID does not exist" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.238452 4713 scope.go:117] "RemoveContainer" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb" Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.238983 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": container with ID starting with c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb not found: ID does not exist" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239096 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"} err="failed to get container status \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": rpc error: code = NotFound desc = could not find container \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": container with ID starting with c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb not found: ID does not exist" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239207 4713 scope.go:117] "RemoveContainer" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26" Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.239609 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": container with ID starting with 
12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26 not found: ID does not exist" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239635 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"} err="failed to get container status \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": rpc error: code = NotFound desc = could not find container \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": container with ID starting with 12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26 not found: ID does not exist" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.323683 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.323940 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cz8lx/must-gather-6ljft" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy" containerID="cri-o://2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" gracePeriod=2 Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.342283 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.691907 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/copy/0.log" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.692945 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.729793 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"0ea30b1a-51f4-4455-b6eb-d382b491da53\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.730093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"0ea30b1a-51f4-4455-b6eb-d382b491da53\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.736946 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78" (OuterVolumeSpecName: "kube-api-access-qjx78") pod "0ea30b1a-51f4-4455-b6eb-d382b491da53" (UID: "0ea30b1a-51f4-4455-b6eb-d382b491da53"). InnerVolumeSpecName "kube-api-access-qjx78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.800393 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0ea30b1a-51f4-4455-b6eb-d382b491da53" (UID: "0ea30b1a-51f4-4455-b6eb-d382b491da53"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.832415 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.832468 4713 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.152688 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/copy/0.log" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153183 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" exitCode=143 Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153266 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153265 4713 scope.go:117] "RemoveContainer" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.170487 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.226621 4713 scope.go:117] "RemoveContainer" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" Mar 08 00:40:40 crc kubenswrapper[4713]: E0308 00:40:40.228133 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": container with ID starting with 2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7 not found: ID does not exist" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228219 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"} err="failed to get container status \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": rpc error: code = NotFound desc = could not find container \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": container with ID starting with 2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7 not found: ID does not exist" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228266 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" Mar 08 00:40:40 crc kubenswrapper[4713]: E0308 00:40:40.228733 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": container with ID starting with 5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7 not found: ID does not exist" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228770 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"} err="failed to get container status \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": rpc error: code = NotFound desc = could not find container \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": container with ID starting with 5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7 not found: ID does not exist" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.549155 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" path="/var/lib/kubelet/pods/0ea30b1a-51f4-4455-b6eb-d382b491da53/volumes" Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.549790 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" path="/var/lib/kubelet/pods/b6fd257c-a12b-4c64-b2d2-8f89db2abb10/volumes" Mar 08 00:41:04 crc kubenswrapper[4713]: I0308 00:41:04.501378 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:41:04 crc kubenswrapper[4713]: I0308 00:41:04.501765 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.466852 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468198 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-utilities" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468257 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-utilities" Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468296 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-content" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468318 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-content" Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468348 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468365 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468400 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy" Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468446 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468462 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468781 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468819 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468931 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.471190 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.481319 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612483 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: 
\"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612562 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713736 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713764 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.714289 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID: 
\"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.714325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.733729 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.816034 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.088611 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.397836 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" exitCode=0 Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.397874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a"} Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.398201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" 
event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"9d1813c737de7699c02f3dc563346797ca3fce4b1d9b6cb161c92b45f6f598f5"} Mar 08 00:41:12 crc kubenswrapper[4713]: I0308 00:41:12.411947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} Mar 08 00:41:13 crc kubenswrapper[4713]: I0308 00:41:13.424140 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" exitCode=0 Mar 08 00:41:13 crc kubenswrapper[4713]: I0308 00:41:13.424203 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} Mar 08 00:41:14 crc kubenswrapper[4713]: I0308 00:41:14.434323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} Mar 08 00:41:14 crc kubenswrapper[4713]: I0308 00:41:14.454056 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x94gq" podStartSLOduration=1.793535927 podStartE2EDuration="4.454035981s" podCreationTimestamp="2026-03-08 00:41:10 +0000 UTC" firstStartedPulling="2026-03-08 00:41:11.398956829 +0000 UTC m=+2125.518589062" lastFinishedPulling="2026-03-08 00:41:14.059456873 +0000 UTC m=+2128.179089116" observedRunningTime="2026-03-08 00:41:14.451611687 +0000 UTC m=+2128.571243950" watchObservedRunningTime="2026-03-08 00:41:14.454035981 +0000 UTC 
m=+2128.573668224" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.816357 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.817347 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.885495 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:21 crc kubenswrapper[4713]: I0308 00:41:21.529147 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:21 crc kubenswrapper[4713]: I0308 00:41:21.570791 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.506327 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x94gq" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" containerID="cri-o://bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" gracePeriod=2 Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.853369 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924426 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924505 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924617 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.926937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities" (OuterVolumeSpecName: "utilities") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.931200 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg" (OuterVolumeSpecName: "kube-api-access-792pg") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "kube-api-access-792pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.994316 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026024 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026073 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026092 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528394 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" exitCode=0 Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528490 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528516 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.529926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"9d1813c737de7699c02f3dc563346797ca3fce4b1d9b6cb161c92b45f6f598f5"} Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.529970 4713 scope.go:117] "RemoveContainer" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.563496 4713 scope.go:117] "RemoveContainer" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.582700 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.589524 4713 scope.go:117] "RemoveContainer" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.589663 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.612862 4713 scope.go:117] "RemoveContainer" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.613305 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": container with ID starting with bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef not found: ID does not exist" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.613411 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} err="failed to get container status \"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": rpc error: code = NotFound desc = could not find container \"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": container with ID starting with bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef not found: ID does not exist" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.613513 4713 scope.go:117] "RemoveContainer" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.614047 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": container with ID starting with 7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09 not found: ID does not exist" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614088 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} err="failed to get container status \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": rpc error: code = NotFound desc = could not find container \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": container with ID 
starting with 7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09 not found: ID does not exist" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614117 4713 scope.go:117] "RemoveContainer" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.614460 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": container with ID starting with 8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a not found: ID does not exist" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614500 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a"} err="failed to get container status \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": rpc error: code = NotFound desc = could not find container \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": container with ID starting with 8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a not found: ID does not exist" Mar 08 00:41:26 crc kubenswrapper[4713]: I0308 00:41:26.559010 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e79129c-88cb-499d-9181-37edfb346e17" path="/var/lib/kubelet/pods/8e79129c-88cb-499d-9181-37edfb346e17/volumes" Mar 08 00:41:34 crc kubenswrapper[4713]: I0308 00:41:34.501152 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:41:34 crc kubenswrapper[4713]: I0308 
00:41:34.501731 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.148355 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149528 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149552 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149599 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-content" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149616 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-content" Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149639 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-utilities" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149656 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-utilities" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149990 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.150731 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.153493 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.153639 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.154899 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.155234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.169997 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.271022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.292905 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: 
\"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.479562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.718131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: W0308 00:42:00.722349 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9759dcd8_b056_4924_9c1f_96ae6cdd2341.slice/crio-0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a WatchSource:0}: Error finding container 0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a: Status 404 returned error can't find the container with id 0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.851512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerStarted","Data":"0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a"} Mar 08 00:42:01 crc kubenswrapper[4713]: I0308 00:42:01.859092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerStarted","Data":"024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb"} Mar 08 00:42:01 crc kubenswrapper[4713]: I0308 00:42:01.874421 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" podStartSLOduration=1.08364329 podStartE2EDuration="1.87440073s" podCreationTimestamp="2026-03-08 00:42:00 +0000 UTC" firstStartedPulling="2026-03-08 
00:42:00.724726062 +0000 UTC m=+2174.844358295" lastFinishedPulling="2026-03-08 00:42:01.515483482 +0000 UTC m=+2175.635115735" observedRunningTime="2026-03-08 00:42:01.87214159 +0000 UTC m=+2175.991773823" watchObservedRunningTime="2026-03-08 00:42:01.87440073 +0000 UTC m=+2175.994032963" Mar 08 00:42:02 crc kubenswrapper[4713]: I0308 00:42:02.870951 4713 generic.go:334] "Generic (PLEG): container finished" podID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerID="024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb" exitCode=0 Mar 08 00:42:02 crc kubenswrapper[4713]: I0308 00:42:02.870997 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerDied","Data":"024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb"} Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.095027 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.227499 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.232469 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl" (OuterVolumeSpecName: "kube-api-access-4j5jl") pod "9759dcd8-b056-4924-9c1f-96ae6cdd2341" (UID: "9759dcd8-b056-4924-9c1f-96ae6cdd2341"). InnerVolumeSpecName "kube-api-access-4j5jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.329018 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") on node \"crc\" DevicePath \"\"" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.501811 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.501966 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502592 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502649 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" 
containerName="machine-config-daemon" containerID="cri-o://80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd" gracePeriod=600
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerDied","Data":"0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a"}
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888917 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a"
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888944 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm"
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895346 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd" exitCode=0
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895383 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"}
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"8a96ab182dae708701b9a232e6e12194ed79a11f4ec0534022482994ad49659e"}
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895426 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.977671 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"]
Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.983923 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"]
Mar 08 00:42:06 crc kubenswrapper[4713]: I0308 00:42:06.552525 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" path="/var/lib/kubelet/pods/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c/volumes"
Mar 08 00:42:33 crc kubenswrapper[4713]: I0308 00:42:33.589806 4713 scope.go:117] "RemoveContainer" containerID="d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.744088 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:08 crc kubenswrapper[4713]: E0308 00:43:08.745033 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745052 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745215 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745720 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.754539 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.879962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.981478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.003933 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.078739 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.507468 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.513178 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.400350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerStarted","Data":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"}
Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.400653 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerStarted","Data":"1af2e63be9b7f5c6d70bf485c960db17189bcafb21ed6287d90b04c635002095"}
Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.418860 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-7nqh7" podStartSLOduration=2.324716779 podStartE2EDuration="2.418841532s" podCreationTimestamp="2026-03-08 00:43:08 +0000 UTC" firstStartedPulling="2026-03-08 00:43:09.507221643 +0000 UTC m=+2243.626853876" lastFinishedPulling="2026-03-08 00:43:09.601346396 +0000 UTC m=+2243.720978629" observedRunningTime="2026-03-08 00:43:10.41194236 +0000 UTC m=+2244.531574603" watchObservedRunningTime="2026-03-08 00:43:10.418841532 +0000 UTC m=+2244.538473775"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.159086 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.160717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.174962 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356222 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.457990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458035 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.459401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.476473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.485794 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.971218 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432118 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerID="24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25" exitCode=0
Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432185 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25"}
Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432230 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerStarted","Data":"5db300c60241204ebec94f5b2c1edb7c6a67193a9280f3448324bb8f49146b49"}
Mar 08 00:43:15 crc kubenswrapper[4713]: E0308 00:43:15.070038 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79fef27_446e_4c6b_be4d_2b2885fa81bf.slice/crio-46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79fef27_446e_4c6b_be4d_2b2885fa81bf.slice/crio-conmon-46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 00:43:15 crc kubenswrapper[4713]: I0308 00:43:15.448968 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerID="46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015" exitCode=0
Mar 08 00:43:15 crc kubenswrapper[4713]: I0308 00:43:15.449176 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015"}
Mar 08 00:43:16 crc kubenswrapper[4713]: I0308 00:43:16.457762 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerStarted","Data":"896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82"}
Mar 08 00:43:16 crc kubenswrapper[4713]: I0308 00:43:16.481014 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqqcq" podStartSLOduration=2.028286595 podStartE2EDuration="4.480996767s" podCreationTimestamp="2026-03-08 00:43:12 +0000 UTC" firstStartedPulling="2026-03-08 00:43:13.434986055 +0000 UTC m=+2247.554618288" lastFinishedPulling="2026-03-08 00:43:15.887696227 +0000 UTC m=+2250.007328460" observedRunningTime="2026-03-08 00:43:16.476513568 +0000 UTC m=+2250.596145801" watchObservedRunningTime="2026-03-08 00:43:16.480996767 +0000 UTC m=+2250.600629000"
Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.079412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.080144 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.113055 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.505485 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.347142 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.486537 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.486591 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.501371 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-7nqh7" podUID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" containerName="registry-server" containerID="cri-o://208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" gracePeriod=2
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.564133 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.619706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.892475 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.910856 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") "
Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.921138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r" (OuterVolumeSpecName: "kube-api-access-5bk4r") pod "2e4ce6f4-6278-444b-baf1-fc8bd41857e9" (UID: "2e4ce6f4-6278-444b-baf1-fc8bd41857e9"). InnerVolumeSpecName "kube-api-access-5bk4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.011873 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") on node \"crc\" DevicePath \"\""
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509149 4713 generic.go:334] "Generic (PLEG): container finished" podID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" exitCode=0
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509218 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7"
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509269 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerDied","Data":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"}
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerDied","Data":"1af2e63be9b7f5c6d70bf485c960db17189bcafb21ed6287d90b04c635002095"}
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509370 4713 scope.go:117] "RemoveContainer" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.539408 4713 scope.go:117] "RemoveContainer" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"
Mar 08 00:43:23 crc kubenswrapper[4713]: E0308 00:43:23.540026 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": container with ID starting with 208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c not found: ID does not exist" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.540076 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"} err="failed to get container status \"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": rpc error: code = NotFound desc = could not find container \"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": container with ID starting with 208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c not found: ID does not exist"
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.545771 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.551016 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"]
Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.553922 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" path="/var/lib/kubelet/pods/2e4ce6f4-6278-444b-baf1-fc8bd41857e9/volumes"
Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.939290 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.940279 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqqcq" podUID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerName="registry-server" containerID="cri-o://896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82" gracePeriod=2
Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.538868 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerID="896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82" exitCode=0
Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.538896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82"}
Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.904206 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") "
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") "
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") "
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.054404 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities" (OuterVolumeSpecName: "utilities") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.059376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98" (OuterVolumeSpecName: "kube-api-access-4fj98") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). InnerVolumeSpecName "kube-api-access-4fj98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.114586 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.154727 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.155293 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") on node \"crc\" DevicePath \"\""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.155328 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.552852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"5db300c60241204ebec94f5b2c1edb7c6a67193a9280f3448324bb8f49146b49"}
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.553195 4713 scope.go:117] "RemoveContainer" containerID="896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82"
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.553063 4713 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq"
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.578103 4713 scope.go:117] "RemoveContainer" containerID="46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015"
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.588140 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.594884 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"]
Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.608753 4713 scope.go:117] "RemoveContainer" containerID="24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25"
Mar 08 00:43:28 crc kubenswrapper[4713]: I0308 00:43:28.549811 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" path="/var/lib/kubelet/pods/c79fef27-446e-4c6b-be4d-2b2885fa81bf/volumes"